Feb 27 00:05:29 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 27 00:05:29 crc restorecon[4682]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 27 00:05:29 crc restorecon[4682]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 27 00:05:29 crc restorecon[4682]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:29 crc 
restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 00:05:29 crc restorecon[4682]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 00:05:29 crc restorecon[4682]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 00:05:29 crc restorecon[4682]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 00:05:29 crc 
restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 27 
00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 27 00:05:29 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 
crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc 
restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc 
restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc 
restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 00:05:30 crc 
restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 00:05:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 00:05:30 crc restorecon[4682]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 00:05:30 crc restorecon[4682]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 27 00:05:31 crc kubenswrapper[4781]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 27 00:05:31 crc kubenswrapper[4781]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 27 00:05:31 crc kubenswrapper[4781]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 27 00:05:31 crc kubenswrapper[4781]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 27 00:05:31 crc kubenswrapper[4781]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 27 00:05:31 crc kubenswrapper[4781]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.088697 4781 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.095910 4781 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.095942 4781 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.095953 4781 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.095962 4781 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.095972 4781 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.095982 4781 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.095992 4781 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096002 4781 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096012 4781 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096021 4781 
feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096048 4781 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096059 4781 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096069 4781 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096077 4781 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096088 4781 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096097 4781 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096106 4781 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096114 4781 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096121 4781 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096129 4781 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096136 4781 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096145 4781 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096153 4781 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096161 4781 
feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096171 4781 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096182 4781 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096190 4781 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096198 4781 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096207 4781 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096214 4781 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096222 4781 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096230 4781 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096238 4781 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096245 4781 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096253 4781 feature_gate.go:330] unrecognized feature gate: Example Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096260 4781 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096269 4781 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096277 4781 feature_gate.go:330] unrecognized 
feature gate: AdditionalRoutingCapabilities Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096284 4781 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096292 4781 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096301 4781 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096309 4781 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096317 4781 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096324 4781 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096333 4781 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096340 4781 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096360 4781 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096370 4781 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096378 4781 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096387 4781 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096394 4781 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096402 4781 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 27 
00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096410 4781 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096418 4781 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096426 4781 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096434 4781 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096442 4781 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096450 4781 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096458 4781 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096465 4781 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096473 4781 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096480 4781 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096488 4781 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096496 4781 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096507 4781 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096518 4781 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096526 4781 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096534 4781 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096543 4781 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096551 4781 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.096560 4781 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098199 4781 flags.go:64] FLAG: --address="0.0.0.0" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098228 4781 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098250 4781 flags.go:64] FLAG: --anonymous-auth="true" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098261 4781 flags.go:64] FLAG: --application-metrics-count-limit="100" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098272 4781 flags.go:64] FLAG: --authentication-token-webhook="false" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098283 4781 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098294 4781 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098305 4781 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098315 4781 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 
00:05:31.098326 4781 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098335 4781 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098357 4781 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098367 4781 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098376 4781 flags.go:64] FLAG: --cgroup-root="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098384 4781 flags.go:64] FLAG: --cgroups-per-qos="true" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098394 4781 flags.go:64] FLAG: --client-ca-file="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098402 4781 flags.go:64] FLAG: --cloud-config="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098411 4781 flags.go:64] FLAG: --cloud-provider="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098420 4781 flags.go:64] FLAG: --cluster-dns="[]" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098437 4781 flags.go:64] FLAG: --cluster-domain="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098446 4781 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098455 4781 flags.go:64] FLAG: --config-dir="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098464 4781 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098474 4781 flags.go:64] FLAG: --container-log-max-files="5" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098485 4781 flags.go:64] FLAG: --container-log-max-size="10Mi" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098494 4781 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Feb 27 00:05:31 crc 
kubenswrapper[4781]: I0227 00:05:31.098503 4781 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098513 4781 flags.go:64] FLAG: --containerd-namespace="k8s.io" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098522 4781 flags.go:64] FLAG: --contention-profiling="false" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098531 4781 flags.go:64] FLAG: --cpu-cfs-quota="true" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098539 4781 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098549 4781 flags.go:64] FLAG: --cpu-manager-policy="none" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098558 4781 flags.go:64] FLAG: --cpu-manager-policy-options="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098569 4781 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098578 4781 flags.go:64] FLAG: --enable-controller-attach-detach="true" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098587 4781 flags.go:64] FLAG: --enable-debugging-handlers="true" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098620 4781 flags.go:64] FLAG: --enable-load-reader="false" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098659 4781 flags.go:64] FLAG: --enable-server="true" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098668 4781 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098685 4781 flags.go:64] FLAG: --event-burst="100" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098695 4781 flags.go:64] FLAG: --event-qps="50" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098704 4781 flags.go:64] FLAG: --event-storage-age-limit="default=0" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098713 4781 flags.go:64] FLAG: --event-storage-event-limit="default=0" 
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098722 4781 flags.go:64] FLAG: --eviction-hard="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098733 4781 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098742 4781 flags.go:64] FLAG: --eviction-minimum-reclaim="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098751 4781 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098772 4781 flags.go:64] FLAG: --eviction-soft="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098781 4781 flags.go:64] FLAG: --eviction-soft-grace-period="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098790 4781 flags.go:64] FLAG: --exit-on-lock-contention="false" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098800 4781 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098808 4781 flags.go:64] FLAG: --experimental-mounter-path="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098817 4781 flags.go:64] FLAG: --fail-cgroupv1="false" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098826 4781 flags.go:64] FLAG: --fail-swap-on="true" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098834 4781 flags.go:64] FLAG: --feature-gates="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098845 4781 flags.go:64] FLAG: --file-check-frequency="20s" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098854 4781 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098863 4781 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098872 4781 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098884 4781 flags.go:64] FLAG: 
--healthz-port="10248" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098893 4781 flags.go:64] FLAG: --help="false" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098902 4781 flags.go:64] FLAG: --hostname-override="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098910 4781 flags.go:64] FLAG: --housekeeping-interval="10s" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098920 4781 flags.go:64] FLAG: --http-check-frequency="20s" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098929 4781 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098938 4781 flags.go:64] FLAG: --image-credential-provider-config="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098947 4781 flags.go:64] FLAG: --image-gc-high-threshold="85" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098956 4781 flags.go:64] FLAG: --image-gc-low-threshold="80" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098965 4781 flags.go:64] FLAG: --image-service-endpoint="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098973 4781 flags.go:64] FLAG: --kernel-memcg-notification="false" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098982 4781 flags.go:64] FLAG: --kube-api-burst="100" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.098991 4781 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099000 4781 flags.go:64] FLAG: --kube-api-qps="50" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099009 4781 flags.go:64] FLAG: --kube-reserved="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099018 4781 flags.go:64] FLAG: --kube-reserved-cgroup="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099026 4781 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099036 4781 flags.go:64] FLAG: 
--kubelet-cgroups="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099044 4781 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099053 4781 flags.go:64] FLAG: --lock-file="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099062 4781 flags.go:64] FLAG: --log-cadvisor-usage="false" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099071 4781 flags.go:64] FLAG: --log-flush-frequency="5s" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099080 4781 flags.go:64] FLAG: --log-json-info-buffer-size="0" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099093 4781 flags.go:64] FLAG: --log-json-split-stream="false" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099113 4781 flags.go:64] FLAG: --log-text-info-buffer-size="0" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099123 4781 flags.go:64] FLAG: --log-text-split-stream="false" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099131 4781 flags.go:64] FLAG: --logging-format="text" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099140 4781 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099150 4781 flags.go:64] FLAG: --make-iptables-util-chains="true" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099159 4781 flags.go:64] FLAG: --manifest-url="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099168 4781 flags.go:64] FLAG: --manifest-url-header="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099179 4781 flags.go:64] FLAG: --max-housekeeping-interval="15s" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099189 4781 flags.go:64] FLAG: --max-open-files="1000000" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099204 4781 flags.go:64] FLAG: --max-pods="110" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099213 4781 flags.go:64] FLAG: 
--maximum-dead-containers="-1" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099222 4781 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099231 4781 flags.go:64] FLAG: --memory-manager-policy="None" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099239 4781 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099249 4781 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099257 4781 flags.go:64] FLAG: --node-ip="192.168.126.11" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099266 4781 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099285 4781 flags.go:64] FLAG: --node-status-max-images="50" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099294 4781 flags.go:64] FLAG: --node-status-update-frequency="10s" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099303 4781 flags.go:64] FLAG: --oom-score-adj="-999" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099312 4781 flags.go:64] FLAG: --pod-cidr="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099321 4781 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099334 4781 flags.go:64] FLAG: --pod-manifest-path="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099342 4781 flags.go:64] FLAG: --pod-max-pids="-1" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099352 4781 flags.go:64] FLAG: --pods-per-core="0" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099360 4781 flags.go:64] FLAG: --port="10250" Feb 27 00:05:31 crc 
kubenswrapper[4781]: I0227 00:05:31.099369 4781 flags.go:64] FLAG: --protect-kernel-defaults="false" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099378 4781 flags.go:64] FLAG: --provider-id="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099387 4781 flags.go:64] FLAG: --qos-reserved="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099396 4781 flags.go:64] FLAG: --read-only-port="10255" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099405 4781 flags.go:64] FLAG: --register-node="true" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099413 4781 flags.go:64] FLAG: --register-schedulable="true" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099488 4781 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099504 4781 flags.go:64] FLAG: --registry-burst="10" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099514 4781 flags.go:64] FLAG: --registry-qps="5" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099523 4781 flags.go:64] FLAG: --reserved-cpus="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099544 4781 flags.go:64] FLAG: --reserved-memory="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099558 4781 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099567 4781 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099576 4781 flags.go:64] FLAG: --rotate-certificates="false" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099585 4781 flags.go:64] FLAG: --rotate-server-certificates="false" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099594 4781 flags.go:64] FLAG: --runonce="false" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099604 4781 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 
00:05:31.099613 4781 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099669 4781 flags.go:64] FLAG: --seccomp-default="false" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099680 4781 flags.go:64] FLAG: --serialize-image-pulls="true" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099689 4781 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099698 4781 flags.go:64] FLAG: --storage-driver-db="cadvisor" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099707 4781 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099716 4781 flags.go:64] FLAG: --storage-driver-password="root" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099725 4781 flags.go:64] FLAG: --storage-driver-secure="false" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099734 4781 flags.go:64] FLAG: --storage-driver-table="stats" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099743 4781 flags.go:64] FLAG: --storage-driver-user="root" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099751 4781 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099760 4781 flags.go:64] FLAG: --sync-frequency="1m0s" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099770 4781 flags.go:64] FLAG: --system-cgroups="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099778 4781 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099792 4781 flags.go:64] FLAG: --system-reserved-cgroup="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099801 4781 flags.go:64] FLAG: --tls-cert-file="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099809 4781 flags.go:64] FLAG: --tls-cipher-suites="[]" Feb 27 00:05:31 
crc kubenswrapper[4781]: I0227 00:05:31.099830 4781 flags.go:64] FLAG: --tls-min-version="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099838 4781 flags.go:64] FLAG: --tls-private-key-file="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099847 4781 flags.go:64] FLAG: --topology-manager-policy="none" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099855 4781 flags.go:64] FLAG: --topology-manager-policy-options="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099864 4781 flags.go:64] FLAG: --topology-manager-scope="container" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099873 4781 flags.go:64] FLAG: --v="2" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099884 4781 flags.go:64] FLAG: --version="false" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099895 4781 flags.go:64] FLAG: --vmodule="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099906 4781 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.099915 4781 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100192 4781 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100203 4781 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100223 4781 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100233 4781 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100242 4781 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100251 4781 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 27 00:05:31 crc 
kubenswrapper[4781]: W0227 00:05:31.100260 4781 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100268 4781 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100276 4781 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100284 4781 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100292 4781 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100300 4781 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100310 4781 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100319 4781 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100328 4781 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100336 4781 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100344 4781 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100352 4781 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100360 4781 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100367 4781 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100377 4781 
feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100386 4781 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100395 4781 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100403 4781 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100410 4781 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100418 4781 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100425 4781 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100434 4781 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100442 4781 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100449 4781 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100457 4781 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100465 4781 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100473 4781 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100481 4781 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100489 4781 feature_gate.go:330] unrecognized feature gate: 
AWSEFSDriverVolumeMetrics Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100500 4781 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100509 4781 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100517 4781 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100537 4781 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100546 4781 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100555 4781 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100563 4781 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100571 4781 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100579 4781 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100587 4781 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100595 4781 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100603 4781 feature_gate.go:330] unrecognized feature gate: Example Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100611 4781 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100620 4781 feature_gate.go:330] unrecognized feature gate: PinnedImages 
Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100651 4781 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100659 4781 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100667 4781 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100675 4781 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100683 4781 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100690 4781 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100698 4781 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100706 4781 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100716 4781 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100727 4781 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100735 4781 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100744 4781 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100752 4781 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100760 4781 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100768 4781 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100777 4781 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100784 4781 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100792 4781 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100801 4781 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100809 4781 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100817 4781 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.100827 4781 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.101540 4781 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.114176 4781 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.114222 4781 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114327 4781 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114337 4781 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114343 4781 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114349 4781 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114354 4781 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114360 4781 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114365 4781 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114371 4781 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114376 4781 
feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114381 4781 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114386 4781 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114391 4781 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114397 4781 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114409 4781 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114415 4781 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114420 4781 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114425 4781 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114431 4781 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114436 4781 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114441 4781 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114446 4781 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114453 4781 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114464 4781 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114472 4781 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114478 4781 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114484 4781 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114489 4781 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114495 4781 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114501 4781 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114507 4781 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114512 4781 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114517 4781 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114522 4781 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114527 4781 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114532 4781 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114538 4781 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 27 00:05:31 crc 
kubenswrapper[4781]: W0227 00:05:31.114543 4781 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114548 4781 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114553 4781 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114558 4781 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114564 4781 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114569 4781 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114574 4781 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114618 4781 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114654 4781 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114659 4781 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114664 4781 feature_gate.go:330] unrecognized feature gate: Example Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114669 4781 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114675 4781 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114681 4781 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114686 4781 feature_gate.go:330] 
unrecognized feature gate: SigstoreImageVerification Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114699 4781 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114705 4781 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114710 4781 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114715 4781 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114720 4781 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114725 4781 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114731 4781 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114737 4781 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114742 4781 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114748 4781 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114755 4781 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114762 4781 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114769 4781 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114776 4781 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114782 4781 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114788 4781 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114793 4781 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114799 4781 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114804 4781 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.114810 4781 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.114819 4781 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115011 4781 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115029 4781 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115036 4781 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115041 4781 feature_gate.go:330] 
unrecognized feature gate: MinimumKubeletVersion Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115047 4781 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115052 4781 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115057 4781 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115063 4781 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115068 4781 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115076 4781 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115083 4781 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115088 4781 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115096 4781 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115101 4781 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115106 4781 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115111 4781 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115117 4781 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115122 4781 feature_gate.go:330] unrecognized feature 
gate: MachineConfigNodes Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115127 4781 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115132 4781 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115139 4781 feature_gate.go:330] unrecognized feature gate: Example Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115146 4781 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115152 4781 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115157 4781 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115162 4781 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115167 4781 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115172 4781 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115177 4781 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115183 4781 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115188 4781 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115193 4781 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115198 4781 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 27 
00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115204 4781 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115209 4781 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115216 4781 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115222 4781 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115228 4781 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115242 4781 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115248 4781 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115254 4781 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115259 4781 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115264 4781 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115270 4781 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115276 4781 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115281 4781 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115286 4781 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 27 00:05:31 crc kubenswrapper[4781]: 
W0227 00:05:31.115291 4781 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115297 4781 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115302 4781 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115307 4781 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115314 4781 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115320 4781 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115326 4781 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115332 4781 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115338 4781 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115344 4781 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115349 4781 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115355 4781 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115360 4781 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115365 4781 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 27 00:05:31 crc 
kubenswrapper[4781]: W0227 00:05:31.115370 4781 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115375 4781 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115380 4781 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115386 4781 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115392 4781 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115397 4781 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115402 4781 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115407 4781 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115412 4781 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115418 4781 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.115423 4781 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.115432 4781 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false 
ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.115657 4781 server.go:940] "Client rotation is on, will bootstrap in background" Feb 27 00:05:31 crc kubenswrapper[4781]: E0227 00:05:31.120304 4781 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.125289 4781 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.125473 4781 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.127517 4781 server.go:997] "Starting client certificate rotation" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.127571 4781 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.127753 4781 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.151817 4781 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 27 00:05:31 crc kubenswrapper[4781]: E0227 00:05:31.153662 4781 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.156034 4781 
dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.171045 4781 log.go:25] "Validated CRI v1 runtime API" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.204565 4781 log.go:25] "Validated CRI v1 image API" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.206182 4781 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.210568 4781 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-27-00-00-48-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.210614 4781 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.234230 4781 manager.go:217] Machine: {Timestamp:2026-02-27 00:05:31.231196649 +0000 UTC m=+0.488736233 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:673e9d6f-5525-49c7-9d73-70585e17af5f BootID:6d1802c0-d9dd-4bd7-99c8-bbe950fe4246 Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 
DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:35:72:4f Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:35:72:4f Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:1c:29:ea Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:db:5d:02 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:b5:0d:dc Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:47:6f:ef Speed:-1 Mtu:1496} {Name:eth10 MacAddress:3a:e1:5b:24:e6:1b Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:56:b0:c6:9e:12:6e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} 
{Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 
Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.234526 4781 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.234794 4781 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.236106 4781 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.236387 4781 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.236421 4781 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.236729 4781 topology_manager.go:138] "Creating topology manager with none policy" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.236742 4781 container_manager_linux.go:303] "Creating device plugin manager" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.237077 4781 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.237112 4781 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.237340 4781 state_mem.go:36] "Initialized new in-memory state store" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.237426 4781 server.go:1245] "Using root directory" path="/var/lib/kubelet" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.240394 4781 kubelet.go:418] "Attempting to sync node with API server" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.240417 4781 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.240446 4781 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.240668 4781 kubelet.go:324] "Adding apiserver pod source" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.240714 4781 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.245045 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.245215 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Feb 27 00:05:31 crc kubenswrapper[4781]: E0227 00:05:31.245317 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: 
failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError" Feb 27 00:05:31 crc kubenswrapper[4781]: E0227 00:05:31.245352 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.245421 4781 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.246205 4781 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.248234 4781 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.249649 4781 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.249673 4781 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.249683 4781 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.249692 4781 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.249706 4781 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.249714 4781 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.249722 4781 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.249735 4781 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.249745 4781 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.249764 4781 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.249781 4781 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.249790 4781 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.250727 4781 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.251146 4781 server.go:1280] "Started kubelet" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.252363 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.252558 4781 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.252561 4781 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Feb 27 00:05:31 crc systemd[1]: Started Kubernetes Kubelet. Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.253011 4781 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.253283 4781 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.253322 4781 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.253610 4781 volume_manager.go:287] "The desired_state_of_world populator starts" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.253642 4781 volume_manager.go:289] "Starting Kubelet Volume Manager" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.253712 4781 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.254128 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Feb 
27 00:05:31 crc kubenswrapper[4781]: E0227 00:05:31.254180 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError" Feb 27 00:05:31 crc kubenswrapper[4781]: E0227 00:05:31.253545 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:05:31 crc kubenswrapper[4781]: E0227 00:05:31.260375 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="200ms" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.262847 4781 factory.go:55] Registering systemd factory Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.262878 4781 factory.go:221] Registration of the systemd container factory successfully Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.263072 4781 server.go:460] "Adding debug handlers to kubelet server" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.263254 4781 factory.go:153] Registering CRI-O factory Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.263293 4781 factory.go:221] Registration of the crio container factory successfully Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.263434 4781 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.263477 4781 factory.go:103] Registering Raw factory Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 
00:05:31.263513 4781 manager.go:1196] Started watching for new ooms in manager Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.264672 4781 manager.go:319] Starting recovery of all containers Feb 27 00:05:31 crc kubenswrapper[4781]: E0227 00:05:31.262363 4781 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.89:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1897f1b368abff74 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.251122036 +0000 UTC m=+0.508661590,LastTimestamp:2026-02-27 00:05:31.251122036 +0000 UTC m=+0.508661590,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269395 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269430 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269440 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" 
seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269449 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269458 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269466 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269475 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269483 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269493 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269501 
4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269509 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269518 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269531 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269541 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269551 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269560 4781 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269569 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269578 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269586 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269595 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269603 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269612 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269620 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269647 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269657 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269666 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269678 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269698 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269707 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269716 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269743 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269752 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269761 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269770 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269778 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269788 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269796 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269805 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269813 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269822 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269849 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269858 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269867 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269875 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269884 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269892 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269920 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269928 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269937 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269944 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269953 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269961 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" 
seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269973 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269983 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269991 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.269999 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.270147 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.270162 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.270171 
4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.270181 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.270189 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.270225 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.270234 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.270242 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.270266 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.270276 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.270285 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.270294 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.270313 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.270322 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.270330 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.270338 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.270348 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.270357 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.270367 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.270376 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.270385 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.270394 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.270404 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.270413 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.270421 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.270431 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.270440 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.272730 4781 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.272784 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.272801 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.273785 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.273825 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.273842 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.273854 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.273867 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.273878 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.273891 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.273907 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.273920 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.273932 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.273949 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.273962 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.273973 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.273986 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.274000 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" 
seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.274013 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.274610 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.274656 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.274674 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.274700 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.274718 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.274736 
4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.274752 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.274769 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.274785 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.274800 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.274815 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.274830 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.274848 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.274862 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.274875 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.274889 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.274902 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.275797 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" 
volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.275826 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.275838 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.275849 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.275860 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.275870 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.275881 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.275890 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.275899 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.275909 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.275921 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.275931 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.275941 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" 
seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.275951 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.275961 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.275972 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.275982 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.275998 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276007 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Feb 27 00:05:31 crc 
kubenswrapper[4781]: I0227 00:05:31.276018 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276028 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276038 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276048 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276058 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276069 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276079 4781 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276089 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276099 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276109 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276119 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276129 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276138 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276148 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276158 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276168 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276178 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276188 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276198 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276207 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276217 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276227 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276237 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276250 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276259 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276270 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276280 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276290 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276299 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276309 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276319 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276329 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276339 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276348 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276359 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276368 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276378 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276388 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276399 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276408 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276418 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276428 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276437 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276446 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276456 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276467 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276477 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276487 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276498 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276508 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276517 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276528 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276537 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276547 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276556 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276568 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276577 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276593 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276602 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276614 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276635 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276645 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276656 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276665 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276676 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276686 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276696 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" 
seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276706 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276716 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.276725 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.277481 4781 reconstruct.go:97] "Volume reconstruction finished" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.277510 4781 reconciler.go:26] "Reconciler: start to sync state" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.285232 4781 manager.go:324] Recovery completed Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.299570 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.301429 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.301456 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.301464 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.303963 4781 cpu_manager.go:225] "Starting CPU manager" policy="none" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.303979 4781 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.304005 4781 state_mem.go:36] "Initialized new in-memory state store" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.305943 4781 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.308031 4781 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.308084 4781 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.308107 4781 kubelet.go:2335] "Starting kubelet main sync loop" Feb 27 00:05:31 crc kubenswrapper[4781]: E0227 00:05:31.308155 4781 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.311541 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Feb 27 00:05:31 crc kubenswrapper[4781]: E0227 00:05:31.311643 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.326130 4781 policy_none.go:49] "None 
policy: Start" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.326942 4781 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.326972 4781 state_mem.go:35] "Initializing new in-memory state store" Feb 27 00:05:31 crc kubenswrapper[4781]: E0227 00:05:31.354471 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.376705 4781 manager.go:334] "Starting Device Plugin manager" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.376776 4781 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.376792 4781 server.go:79] "Starting device plugin registration server" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.377270 4781 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.377289 4781 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.377411 4781 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.377549 4781 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.377564 4781 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 27 00:05:31 crc kubenswrapper[4781]: E0227 00:05:31.385145 4781 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.408929 4781 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.409005 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.409712 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.409740 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.409749 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.409890 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.410212 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.410274 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.410456 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.410495 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.410505 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.410665 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.410799 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.410854 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.411091 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.411127 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.411140 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.411609 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.411656 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.411667 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.411735 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.411761 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.411773 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.411893 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:31 crc kubenswrapper[4781]: 
I0227 00:05:31.411999 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.412031 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.412442 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.412475 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.412484 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.412511 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.412525 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.412533 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.412595 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.412615 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.412678 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.413351 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.413380 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.413363 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.413391 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.413412 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.413428 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.413566 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.413592 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.414228 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.414251 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.414261 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:31 crc kubenswrapper[4781]: E0227 00:05:31.461485 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="400ms" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.477675 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.478722 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.478752 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.478762 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.478786 4781 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 00:05:31 crc kubenswrapper[4781]: E0227 00:05:31.479240 4781 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.89:6443: connect: connection refused" node="crc" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.479373 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.479413 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.479451 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.479484 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.479515 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.479581 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.479618 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.479648 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.479697 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.479731 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod 
\"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.479755 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.479779 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.479800 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.479824 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.479848 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.580688 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.580749 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.580766 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.580781 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.580798 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.580813 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.580826 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.580840 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.580854 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.580870 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.580885 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.580899 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.580912 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.580928 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.580918 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.580956 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.580990 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.580944 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.581015 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.581315 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.581341 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.581355 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.581346 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.580922 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.581481 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.581485 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.581434 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.581536 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.581541 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.581513 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.679730 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.680862 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.680892 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.680902 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.680921 4781 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: E0227 00:05:31.681249 4781 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.89:6443: connect: connection refused" node="crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.739833 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.747936 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.769618 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.785363 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-e5dfccd0ffb945e9f57a73497320c403a3e6669bb44408fdcdb08d4cfe8304b3 WatchSource:0}: Error finding container e5dfccd0ffb945e9f57a73497320c403a3e6669bb44408fdcdb08d4cfe8304b3: Status 404 returned error can't find the container with id e5dfccd0ffb945e9f57a73497320c403a3e6669bb44408fdcdb08d4cfe8304b3
Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.786159 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-851e59b83b38b6e3fe5361c3c7767cf3e680281bd077d9debecf26454227d2f5 WatchSource:0}: Error finding container 851e59b83b38b6e3fe5361c3c7767cf3e680281bd077d9debecf26454227d2f5: Status 404 returned error can't find the container with id 851e59b83b38b6e3fe5361c3c7767cf3e680281bd077d9debecf26454227d2f5
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.786192 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: I0227 00:05:31.791531 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.792052 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-027824cdc31fd378422407859ab191ad82aa297d50397df05a014ba8d23b0192 WatchSource:0}: Error finding container 027824cdc31fd378422407859ab191ad82aa297d50397df05a014ba8d23b0192: Status 404 returned error can't find the container with id 027824cdc31fd378422407859ab191ad82aa297d50397df05a014ba8d23b0192
Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.796365 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-84c64d90bbe6500df0e7a5ce43cca9f93887495042e33082c13305fa31cd7035 WatchSource:0}: Error finding container 84c64d90bbe6500df0e7a5ce43cca9f93887495042e33082c13305fa31cd7035: Status 404 returned error can't find the container with id 84c64d90bbe6500df0e7a5ce43cca9f93887495042e33082c13305fa31cd7035
Feb 27 00:05:31 crc kubenswrapper[4781]: W0227 00:05:31.806574 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-b6b11811ab566c6e2c309c7f951676085cec0f1539e20d814f2bf5498ce2ab72 WatchSource:0}: Error finding container b6b11811ab566c6e2c309c7f951676085cec0f1539e20d814f2bf5498ce2ab72: Status 404 returned error can't find the container with id b6b11811ab566c6e2c309c7f951676085cec0f1539e20d814f2bf5498ce2ab72
Feb 27 00:05:31 crc kubenswrapper[4781]: E0227 00:05:31.862360 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="800ms"
Feb 27 00:05:32 crc kubenswrapper[4781]: I0227 00:05:32.081607 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 00:05:32 crc kubenswrapper[4781]: I0227 00:05:32.083242 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:05:32 crc kubenswrapper[4781]: I0227 00:05:32.083294 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:05:32 crc kubenswrapper[4781]: I0227 00:05:32.083307 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:05:32 crc kubenswrapper[4781]: I0227 00:05:32.083338 4781 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 27 00:05:32 crc kubenswrapper[4781]: E0227 00:05:32.083851 4781 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.89:6443: connect: connection refused" node="crc"
Feb 27 00:05:32 crc kubenswrapper[4781]: I0227 00:05:32.253464 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused
Feb 27 00:05:32 crc kubenswrapper[4781]: W0227 00:05:32.291782 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused
Feb 27 00:05:32 crc kubenswrapper[4781]: E0227 00:05:32.291849 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError"
Feb 27 00:05:32 crc kubenswrapper[4781]: I0227 00:05:32.313835 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"b6b11811ab566c6e2c309c7f951676085cec0f1539e20d814f2bf5498ce2ab72"}
Feb 27 00:05:32 crc kubenswrapper[4781]: I0227 00:05:32.314939 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"84c64d90bbe6500df0e7a5ce43cca9f93887495042e33082c13305fa31cd7035"}
Feb 27 00:05:32 crc kubenswrapper[4781]: I0227 00:05:32.315850 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"027824cdc31fd378422407859ab191ad82aa297d50397df05a014ba8d23b0192"}
Feb 27 00:05:32 crc kubenswrapper[4781]: I0227 00:05:32.316704 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e5dfccd0ffb945e9f57a73497320c403a3e6669bb44408fdcdb08d4cfe8304b3"}
Feb 27 00:05:32 crc kubenswrapper[4781]: I0227 00:05:32.317683 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"851e59b83b38b6e3fe5361c3c7767cf3e680281bd077d9debecf26454227d2f5"}
Feb 27 00:05:32 crc kubenswrapper[4781]: W0227 00:05:32.540112 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused
Feb 27 00:05:32 crc kubenswrapper[4781]: E0227 00:05:32.540173 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError"
Feb 27 00:05:32 crc kubenswrapper[4781]: W0227 00:05:32.622594 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused
Feb 27 00:05:32 crc kubenswrapper[4781]: E0227 00:05:32.622683 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError"
Feb 27 00:05:32 crc kubenswrapper[4781]: W0227 00:05:32.644351 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused
Feb 27 00:05:32 crc kubenswrapper[4781]: E0227 00:05:32.644436 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError"
Feb 27 00:05:32 crc kubenswrapper[4781]: E0227 00:05:32.663304 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="1.6s"
Feb 27 00:05:32 crc kubenswrapper[4781]: I0227 00:05:32.884094 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 00:05:32 crc kubenswrapper[4781]: I0227 00:05:32.885350 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:05:32 crc kubenswrapper[4781]: I0227 00:05:32.885381 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:05:32 crc kubenswrapper[4781]: I0227 00:05:32.885409 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:05:32 crc kubenswrapper[4781]: I0227 00:05:32.885427 4781 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 27 00:05:32 crc kubenswrapper[4781]: E0227 00:05:32.885768 4781 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.89:6443: connect: connection refused" node="crc"
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.225079 4781 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 27 00:05:33 crc kubenswrapper[4781]: E0227 00:05:33.226416 4781 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError"
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.253591 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.322339 4781 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="88da3d10bf5da71b6475cfb7ddebf8cac90fca8bdfa6b4680b7591976ec59877" exitCode=0
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.322449 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.322464 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"88da3d10bf5da71b6475cfb7ddebf8cac90fca8bdfa6b4680b7591976ec59877"}
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.323480 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.323513 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.323532 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.324462 4781 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b" exitCode=0
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.324514 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b"}
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.324556 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.325517 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.325543 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.325552 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.327451 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"10b8a07875d5e307a11567c51fbea52305add97506cdcbcabc73603448f40a0e"}
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.327480 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"81d5502a71e2eb41a23613647b53e5e218f6217a28932a75c18b20230d224d2f"}
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.327490 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"32830bc550c9e0ba52cce04b34a09da88eb495a00d5cf27b85ee7a4a76cd494a"}
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.327549 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.327500 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9607db636af16914ff311df9c254406cd6a32326a229fc879bd3923eba2ad477"}
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.328486 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.328525 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.328536 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.330030 4781 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3" exitCode=0
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.330070 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3"}
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.330195 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.331515 4781 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b" exitCode=0
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.331578 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b"}
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.331766 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.332483 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.332530 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.332547 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.332929 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.332959 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.332972 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.334709 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.335708 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.335729 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:05:33 crc kubenswrapper[4781]: I0227 00:05:33.335739 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:05:34 crc kubenswrapper[4781]: I0227 00:05:34.253426 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused
Feb 27 00:05:34 crc kubenswrapper[4781]: E0227 00:05:34.263823 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="3.2s"
Feb 27 00:05:34 crc kubenswrapper[4781]: I0227 00:05:34.337039 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"47a456d85ecc51c763dbf48df62ed7e6f3fdfff351863ff01f677d4a4b9d554e"}
Feb 27 00:05:34 crc kubenswrapper[4781]: I0227 00:05:34.337127 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 00:05:34 crc kubenswrapper[4781]: I0227 00:05:34.337974 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:05:34 crc kubenswrapper[4781]: I0227 00:05:34.338000 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:05:34 crc kubenswrapper[4781]: I0227 00:05:34.338011 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:05:34 crc kubenswrapper[4781]: I0227 00:05:34.339805 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"466063a9fa3266b8383f8443b5c0dd0851f32f784e3ff1bf116b686e0a0dd326"}
Feb 27 00:05:34 crc kubenswrapper[4781]: I0227 00:05:34.339832 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"eb69ca988df73334f28c605e94f042976cc3107e97b8a6e7152c6fa400bc214a"}
Feb 27 00:05:34 crc kubenswrapper[4781]: I0227 00:05:34.339845 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e484c55cbd8737198a15b569a6bea881cbd859019d4ad41402765d0a8922fd2f"}
Feb 27 00:05:34 crc kubenswrapper[4781]: I0227 00:05:34.339909 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 00:05:34 crc kubenswrapper[4781]: I0227 00:05:34.340564 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:05:34 crc kubenswrapper[4781]: I0227 00:05:34.340586 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:05:34 crc kubenswrapper[4781]: I0227 00:05:34.340594 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:05:34 crc kubenswrapper[4781]: I0227 00:05:34.342895 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b"}
Feb 27 00:05:34 crc kubenswrapper[4781]: I0227 00:05:34.342925 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d"}
Feb 27 00:05:34 crc kubenswrapper[4781]: I0227 00:05:34.342934 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5"}
Feb 27 00:05:34 crc kubenswrapper[4781]: I0227 00:05:34.342942 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c"}
Feb 27 00:05:34 crc kubenswrapper[4781]: I0227 00:05:34.344170 4781 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917" exitCode=0
Feb 27 00:05:34 crc kubenswrapper[4781]: I0227 00:05:34.344248 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 00:05:34 crc kubenswrapper[4781]: I0227 00:05:34.344645 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917"}
Feb 27 00:05:34 crc kubenswrapper[4781]: I0227 00:05:34.344686 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 00:05:34 crc kubenswrapper[4781]: I0227 00:05:34.345308 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:05:34 crc kubenswrapper[4781]: I0227 00:05:34.345330 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:05:34 crc kubenswrapper[4781]: I0227 00:05:34.345339 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:34 crc kubenswrapper[4781]: I0227 00:05:34.345373 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:34 crc kubenswrapper[4781]: I0227 00:05:34.345397 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:34 crc kubenswrapper[4781]: I0227 00:05:34.345406 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:34 crc kubenswrapper[4781]: I0227 00:05:34.486337 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:34 crc kubenswrapper[4781]: I0227 00:05:34.487769 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:34 crc kubenswrapper[4781]: I0227 00:05:34.487822 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:34 crc kubenswrapper[4781]: I0227 00:05:34.487841 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:34 crc kubenswrapper[4781]: I0227 00:05:34.487876 4781 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 00:05:34 crc kubenswrapper[4781]: E0227 00:05:34.488506 4781 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.89:6443: connect: connection refused" node="crc" Feb 27 00:05:34 crc kubenswrapper[4781]: W0227 00:05:34.506840 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Feb 27 00:05:34 crc kubenswrapper[4781]: E0227 00:05:34.506911 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError" Feb 27 00:05:35 crc kubenswrapper[4781]: I0227 00:05:35.002205 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 00:05:35 crc kubenswrapper[4781]: I0227 00:05:35.010083 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 00:05:35 crc kubenswrapper[4781]: I0227 00:05:35.270810 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 27 00:05:35 crc kubenswrapper[4781]: I0227 00:05:35.354540 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7f7e00ac2d8b991ebe6208d7af11141a340ff54ad577b6c586c36828a042644b"} Feb 27 00:05:35 crc kubenswrapper[4781]: I0227 00:05:35.354726 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:35 crc kubenswrapper[4781]: I0227 00:05:35.356120 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:35 crc kubenswrapper[4781]: I0227 00:05:35.356165 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 
00:05:35 crc kubenswrapper[4781]: I0227 00:05:35.356186 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:35 crc kubenswrapper[4781]: I0227 00:05:35.358619 4781 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad" exitCode=0 Feb 27 00:05:35 crc kubenswrapper[4781]: I0227 00:05:35.358686 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad"} Feb 27 00:05:35 crc kubenswrapper[4781]: I0227 00:05:35.358771 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:35 crc kubenswrapper[4781]: I0227 00:05:35.358826 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:35 crc kubenswrapper[4781]: I0227 00:05:35.358875 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:35 crc kubenswrapper[4781]: I0227 00:05:35.358941 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:35 crc kubenswrapper[4781]: I0227 00:05:35.360585 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:35 crc kubenswrapper[4781]: I0227 00:05:35.360657 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:35 crc kubenswrapper[4781]: I0227 00:05:35.360677 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:35 crc kubenswrapper[4781]: I0227 00:05:35.360674 4781 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:35 crc kubenswrapper[4781]: I0227 00:05:35.360716 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:35 crc kubenswrapper[4781]: I0227 00:05:35.360727 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:35 crc kubenswrapper[4781]: I0227 00:05:35.360762 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:35 crc kubenswrapper[4781]: I0227 00:05:35.360780 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:35 crc kubenswrapper[4781]: I0227 00:05:35.360735 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:35 crc kubenswrapper[4781]: I0227 00:05:35.360918 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:35 crc kubenswrapper[4781]: I0227 00:05:35.360955 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:35 crc kubenswrapper[4781]: I0227 00:05:35.360972 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:36 crc kubenswrapper[4781]: I0227 00:05:36.366444 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53"} Feb 27 00:05:36 crc kubenswrapper[4781]: I0227 00:05:36.366510 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308"} Feb 27 00:05:36 crc kubenswrapper[4781]: I0227 00:05:36.366535 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc"} Feb 27 00:05:36 crc kubenswrapper[4781]: I0227 00:05:36.366556 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b"} Feb 27 00:05:36 crc kubenswrapper[4781]: I0227 00:05:36.366561 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:36 crc kubenswrapper[4781]: I0227 00:05:36.366576 4781 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 27 00:05:36 crc kubenswrapper[4781]: I0227 00:05:36.366473 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:36 crc kubenswrapper[4781]: I0227 00:05:36.366672 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:36 crc kubenswrapper[4781]: I0227 00:05:36.366589 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:05:36 crc kubenswrapper[4781]: I0227 00:05:36.368107 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:36 crc kubenswrapper[4781]: I0227 00:05:36.368151 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:36 crc kubenswrapper[4781]: I0227 00:05:36.368168 4781 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:36 crc kubenswrapper[4781]: I0227 00:05:36.368183 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:36 crc kubenswrapper[4781]: I0227 00:05:36.368221 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:36 crc kubenswrapper[4781]: I0227 00:05:36.368231 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:36 crc kubenswrapper[4781]: I0227 00:05:36.368242 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:36 crc kubenswrapper[4781]: I0227 00:05:36.368262 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:36 crc kubenswrapper[4781]: I0227 00:05:36.368279 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:36 crc kubenswrapper[4781]: I0227 00:05:36.567533 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:05:36 crc kubenswrapper[4781]: I0227 00:05:36.758768 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 00:05:37 crc kubenswrapper[4781]: I0227 00:05:37.366665 4781 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 27 00:05:37 crc kubenswrapper[4781]: I0227 00:05:37.375143 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9a89e199146ad34ac3d0a41805072282d1d6e9a5c7c1ac2fd243b0b072c152e7"} Feb 27 00:05:37 crc kubenswrapper[4781]: 
I0227 00:05:37.375225 4781 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 27 00:05:37 crc kubenswrapper[4781]: I0227 00:05:37.375299 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:37 crc kubenswrapper[4781]: I0227 00:05:37.375306 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:37 crc kubenswrapper[4781]: I0227 00:05:37.375503 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:37 crc kubenswrapper[4781]: I0227 00:05:37.377001 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:37 crc kubenswrapper[4781]: I0227 00:05:37.377057 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:37 crc kubenswrapper[4781]: I0227 00:05:37.377082 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:37 crc kubenswrapper[4781]: I0227 00:05:37.377135 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:37 crc kubenswrapper[4781]: I0227 00:05:37.377177 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:37 crc kubenswrapper[4781]: I0227 00:05:37.377201 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:37 crc kubenswrapper[4781]: I0227 00:05:37.377274 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:37 crc kubenswrapper[4781]: I0227 00:05:37.377325 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:37 crc 
kubenswrapper[4781]: I0227 00:05:37.377347 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:37 crc kubenswrapper[4781]: I0227 00:05:37.538807 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:05:37 crc kubenswrapper[4781]: I0227 00:05:37.688824 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:37 crc kubenswrapper[4781]: I0227 00:05:37.690026 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:37 crc kubenswrapper[4781]: I0227 00:05:37.690054 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:37 crc kubenswrapper[4781]: I0227 00:05:37.690063 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:37 crc kubenswrapper[4781]: I0227 00:05:37.690081 4781 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 00:05:38 crc kubenswrapper[4781]: I0227 00:05:38.377399 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:38 crc kubenswrapper[4781]: I0227 00:05:38.377432 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:38 crc kubenswrapper[4781]: I0227 00:05:38.378508 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:38 crc kubenswrapper[4781]: I0227 00:05:38.378545 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:38 crc kubenswrapper[4781]: I0227 00:05:38.378561 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 
27 00:05:38 crc kubenswrapper[4781]: I0227 00:05:38.378673 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:38 crc kubenswrapper[4781]: I0227 00:05:38.378719 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:38 crc kubenswrapper[4781]: I0227 00:05:38.378742 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:39 crc kubenswrapper[4781]: I0227 00:05:39.217879 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 00:05:39 crc kubenswrapper[4781]: I0227 00:05:39.218137 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:39 crc kubenswrapper[4781]: I0227 00:05:39.219930 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:39 crc kubenswrapper[4781]: I0227 00:05:39.219980 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:39 crc kubenswrapper[4781]: I0227 00:05:39.219998 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:39 crc kubenswrapper[4781]: I0227 00:05:39.379535 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:39 crc kubenswrapper[4781]: I0227 00:05:39.380599 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:39 crc kubenswrapper[4781]: I0227 00:05:39.380674 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:39 crc kubenswrapper[4781]: I0227 00:05:39.380687 4781 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:40 crc kubenswrapper[4781]: I0227 00:05:40.649817 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 27 00:05:40 crc kubenswrapper[4781]: I0227 00:05:40.649994 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:40 crc kubenswrapper[4781]: I0227 00:05:40.651262 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:40 crc kubenswrapper[4781]: I0227 00:05:40.651380 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:40 crc kubenswrapper[4781]: I0227 00:05:40.651401 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:40 crc kubenswrapper[4781]: I0227 00:05:40.919157 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Feb 27 00:05:41 crc kubenswrapper[4781]: I0227 00:05:41.384606 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:41 crc kubenswrapper[4781]: E0227 00:05:41.385402 4781 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 27 00:05:41 crc kubenswrapper[4781]: I0227 00:05:41.386045 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:41 crc kubenswrapper[4781]: I0227 00:05:41.386078 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:41 crc kubenswrapper[4781]: I0227 00:05:41.386095 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:43 crc 
kubenswrapper[4781]: I0227 00:05:43.283689 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 00:05:43 crc kubenswrapper[4781]: I0227 00:05:43.284399 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:43 crc kubenswrapper[4781]: I0227 00:05:43.285881 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:43 crc kubenswrapper[4781]: I0227 00:05:43.286021 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:43 crc kubenswrapper[4781]: I0227 00:05:43.286203 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:43 crc kubenswrapper[4781]: I0227 00:05:43.291909 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 00:05:43 crc kubenswrapper[4781]: I0227 00:05:43.389808 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:43 crc kubenswrapper[4781]: I0227 00:05:43.390882 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:43 crc kubenswrapper[4781]: I0227 00:05:43.390914 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:43 crc kubenswrapper[4781]: I0227 00:05:43.390926 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:44 crc kubenswrapper[4781]: W0227 00:05:44.910134 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 27 00:05:44 crc kubenswrapper[4781]: I0227 00:05:44.910259 4781 trace.go:236] Trace[411930986]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Feb-2026 00:05:34.908) (total time: 10001ms): Feb 27 00:05:44 crc kubenswrapper[4781]: Trace[411930986]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (00:05:44.910) Feb 27 00:05:44 crc kubenswrapper[4781]: Trace[411930986]: [10.001256871s] [10.001256871s] END Feb 27 00:05:44 crc kubenswrapper[4781]: E0227 00:05:44.910287 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 27 00:05:45 crc kubenswrapper[4781]: W0227 00:05:45.127534 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:45Z is after 2026-02-23T05:33:13Z Feb 27 00:05:45 crc kubenswrapper[4781]: E0227 00:05:45.127618 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:45Z is after 2026-02-23T05:33:13Z" 
logger="UnhandledError" Feb 27 00:05:45 crc kubenswrapper[4781]: I0227 00:05:45.128405 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:45Z is after 2026-02-23T05:33:13Z Feb 27 00:05:45 crc kubenswrapper[4781]: W0227 00:05:45.129143 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:45Z is after 2026-02-23T05:33:13Z Feb 27 00:05:45 crc kubenswrapper[4781]: E0227 00:05:45.129407 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:45Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 00:05:45 crc kubenswrapper[4781]: E0227 00:05:45.131655 4781 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:45Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 00:05:45 crc kubenswrapper[4781]: E0227 00:05:45.133376 4781 controller.go:145] "Failed to ensure lease 
exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:45Z is after 2026-02-23T05:33:13Z" interval="6.4s" Feb 27 00:05:45 crc kubenswrapper[4781]: W0227 00:05:45.135894 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:45Z is after 2026-02-23T05:33:13Z Feb 27 00:05:45 crc kubenswrapper[4781]: E0227 00:05:45.136125 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:45Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 00:05:45 crc kubenswrapper[4781]: E0227 00:05:45.136368 4781 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:45Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1897f1b368abff74 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.251122036 +0000 UTC m=+0.508661590,LastTimestamp:2026-02-27 
00:05:31.251122036 +0000 UTC m=+0.508661590,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:05:45 crc kubenswrapper[4781]: E0227 00:05:45.137563 4781 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:45Z is after 2026-02-23T05:33:13Z" node="crc" Feb 27 00:05:45 crc kubenswrapper[4781]: I0227 00:05:45.143162 4781 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 27 00:05:45 crc kubenswrapper[4781]: I0227 00:05:45.143391 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 27 00:05:45 crc kubenswrapper[4781]: I0227 00:05:45.148059 4781 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 27 00:05:45 crc kubenswrapper[4781]: I0227 00:05:45.148348 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 27 00:05:45 crc kubenswrapper[4781]: I0227 00:05:45.255786 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:45Z is after 2026-02-23T05:33:13Z Feb 27 00:05:46 crc kubenswrapper[4781]: I0227 00:05:46.256791 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:46Z is after 2026-02-23T05:33:13Z Feb 27 00:05:46 crc kubenswrapper[4781]: I0227 00:05:46.284058 4781 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 27 00:05:46 crc kubenswrapper[4781]: I0227 00:05:46.284126 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 27 00:05:46 crc kubenswrapper[4781]: I0227 00:05:46.397201 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 27 00:05:46 crc kubenswrapper[4781]: I0227 00:05:46.398892 4781 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7f7e00ac2d8b991ebe6208d7af11141a340ff54ad577b6c586c36828a042644b" exitCode=255 Feb 27 00:05:46 crc kubenswrapper[4781]: I0227 00:05:46.398932 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7f7e00ac2d8b991ebe6208d7af11141a340ff54ad577b6c586c36828a042644b"} Feb 27 00:05:46 crc kubenswrapper[4781]: I0227 00:05:46.399083 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:46 crc kubenswrapper[4781]: I0227 00:05:46.399890 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:46 crc kubenswrapper[4781]: I0227 00:05:46.399949 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:46 crc kubenswrapper[4781]: I0227 00:05:46.399968 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:46 crc kubenswrapper[4781]: I0227 00:05:46.400833 4781 scope.go:117] "RemoveContainer" containerID="7f7e00ac2d8b991ebe6208d7af11141a340ff54ad577b6c586c36828a042644b" Feb 27 00:05:46 crc kubenswrapper[4781]: I0227 00:05:46.577163 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:05:47 crc kubenswrapper[4781]: I0227 00:05:47.257784 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:47Z is after 2026-02-23T05:33:13Z Feb 27 00:05:47 crc kubenswrapper[4781]: I0227 00:05:47.405403 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 27 00:05:47 crc kubenswrapper[4781]: I0227 00:05:47.406527 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 27 00:05:47 crc kubenswrapper[4781]: I0227 00:05:47.409304 4781 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2c8810f20fd274cb743f419a437c4cab58c3ca4bf18ab25919217a2e2cb4c3b1" exitCode=255 Feb 27 00:05:47 crc kubenswrapper[4781]: I0227 00:05:47.409367 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2c8810f20fd274cb743f419a437c4cab58c3ca4bf18ab25919217a2e2cb4c3b1"} Feb 27 00:05:47 crc kubenswrapper[4781]: I0227 00:05:47.409445 4781 scope.go:117] "RemoveContainer" containerID="7f7e00ac2d8b991ebe6208d7af11141a340ff54ad577b6c586c36828a042644b" Feb 27 00:05:47 crc kubenswrapper[4781]: I0227 00:05:47.409710 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:47 crc kubenswrapper[4781]: I0227 00:05:47.410885 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:47 crc kubenswrapper[4781]: I0227 00:05:47.410941 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:47 crc kubenswrapper[4781]: I0227 00:05:47.410966 4781 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:47 crc kubenswrapper[4781]: I0227 00:05:47.412070 4781 scope.go:117] "RemoveContainer" containerID="2c8810f20fd274cb743f419a437c4cab58c3ca4bf18ab25919217a2e2cb4c3b1" Feb 27 00:05:47 crc kubenswrapper[4781]: E0227 00:05:47.412467 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 00:05:47 crc kubenswrapper[4781]: I0227 00:05:47.419840 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:05:48 crc kubenswrapper[4781]: I0227 00:05:48.257573 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:48Z is after 2026-02-23T05:33:13Z Feb 27 00:05:48 crc kubenswrapper[4781]: W0227 00:05:48.343688 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:48Z is after 2026-02-23T05:33:13Z Feb 27 00:05:48 crc kubenswrapper[4781]: E0227 00:05:48.343773 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:48Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 00:05:48 crc kubenswrapper[4781]: I0227 00:05:48.414480 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 27 00:05:48 crc kubenswrapper[4781]: I0227 00:05:48.417867 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:48 crc kubenswrapper[4781]: I0227 00:05:48.419342 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:48 crc kubenswrapper[4781]: I0227 00:05:48.419410 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:48 crc kubenswrapper[4781]: I0227 00:05:48.419426 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:48 crc kubenswrapper[4781]: I0227 00:05:48.420612 4781 scope.go:117] "RemoveContainer" containerID="2c8810f20fd274cb743f419a437c4cab58c3ca4bf18ab25919217a2e2cb4c3b1" Feb 27 00:05:48 crc kubenswrapper[4781]: E0227 00:05:48.420945 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 00:05:48 crc kubenswrapper[4781]: W0227 00:05:48.574870 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed 
to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:48Z is after 2026-02-23T05:33:13Z Feb 27 00:05:48 crc kubenswrapper[4781]: E0227 00:05:48.575031 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:48Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 00:05:48 crc kubenswrapper[4781]: I0227 00:05:48.704776 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:05:49 crc kubenswrapper[4781]: I0227 00:05:49.257169 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:49Z is after 2026-02-23T05:33:13Z Feb 27 00:05:49 crc kubenswrapper[4781]: I0227 00:05:49.420808 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:49 crc kubenswrapper[4781]: I0227 00:05:49.422172 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:49 crc kubenswrapper[4781]: I0227 00:05:49.422226 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:49 crc kubenswrapper[4781]: I0227 00:05:49.422246 4781 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:49 crc kubenswrapper[4781]: I0227 00:05:49.423146 4781 scope.go:117] "RemoveContainer" containerID="2c8810f20fd274cb743f419a437c4cab58c3ca4bf18ab25919217a2e2cb4c3b1" Feb 27 00:05:49 crc kubenswrapper[4781]: E0227 00:05:49.423528 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 00:05:50 crc kubenswrapper[4781]: I0227 00:05:50.256238 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:50Z is after 2026-02-23T05:33:13Z Feb 27 00:05:50 crc kubenswrapper[4781]: I0227 00:05:50.422903 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:50 crc kubenswrapper[4781]: I0227 00:05:50.423818 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:50 crc kubenswrapper[4781]: I0227 00:05:50.423879 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:50 crc kubenswrapper[4781]: I0227 00:05:50.423887 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:50 crc kubenswrapper[4781]: I0227 00:05:50.424311 4781 scope.go:117] "RemoveContainer" containerID="2c8810f20fd274cb743f419a437c4cab58c3ca4bf18ab25919217a2e2cb4c3b1" Feb 27 00:05:50 crc kubenswrapper[4781]: E0227 
00:05:50.424444 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 00:05:50 crc kubenswrapper[4781]: I0227 00:05:50.957730 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 27 00:05:50 crc kubenswrapper[4781]: I0227 00:05:50.957969 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:50 crc kubenswrapper[4781]: I0227 00:05:50.959449 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:50 crc kubenswrapper[4781]: I0227 00:05:50.959530 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:50 crc kubenswrapper[4781]: I0227 00:05:50.959556 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:50 crc kubenswrapper[4781]: I0227 00:05:50.973245 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 27 00:05:50 crc kubenswrapper[4781]: W0227 00:05:50.978161 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:50Z is after 2026-02-23T05:33:13Z Feb 27 00:05:50 crc kubenswrapper[4781]: E0227 00:05:50.978281 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: 
Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:50Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 00:05:51 crc kubenswrapper[4781]: I0227 00:05:51.258232 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:51Z is after 2026-02-23T05:33:13Z Feb 27 00:05:51 crc kubenswrapper[4781]: E0227 00:05:51.385600 4781 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 27 00:05:51 crc kubenswrapper[4781]: I0227 00:05:51.425467 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:51 crc kubenswrapper[4781]: I0227 00:05:51.427234 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:51 crc kubenswrapper[4781]: I0227 00:05:51.427341 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:51 crc kubenswrapper[4781]: I0227 00:05:51.427383 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:51 crc kubenswrapper[4781]: I0227 00:05:51.539113 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:51 crc kubenswrapper[4781]: E0227 00:05:51.539193 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:51Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 27 00:05:51 crc kubenswrapper[4781]: I0227 00:05:51.540679 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:51 crc kubenswrapper[4781]: I0227 00:05:51.540727 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:51 crc kubenswrapper[4781]: I0227 00:05:51.540741 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:51 crc kubenswrapper[4781]: I0227 00:05:51.540775 4781 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 00:05:51 crc kubenswrapper[4781]: E0227 00:05:51.545577 4781 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:51Z is after 2026-02-23T05:33:13Z" node="crc" Feb 27 00:05:52 crc kubenswrapper[4781]: I0227 00:05:52.258429 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:52Z is after 2026-02-23T05:33:13Z Feb 27 00:05:53 crc kubenswrapper[4781]: I0227 00:05:53.258459 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:05:53Z is after 2026-02-23T05:33:13Z Feb 27 00:05:53 crc kubenswrapper[4781]: I0227 00:05:53.282735 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:05:53 crc kubenswrapper[4781]: I0227 00:05:53.282931 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:53 crc kubenswrapper[4781]: I0227 00:05:53.284203 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:53 crc kubenswrapper[4781]: I0227 00:05:53.284239 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:53 crc kubenswrapper[4781]: I0227 00:05:53.284253 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:53 crc kubenswrapper[4781]: I0227 00:05:53.284849 4781 scope.go:117] "RemoveContainer" containerID="2c8810f20fd274cb743f419a437c4cab58c3ca4bf18ab25919217a2e2cb4c3b1" Feb 27 00:05:53 crc kubenswrapper[4781]: E0227 00:05:53.285033 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 00:05:53 crc kubenswrapper[4781]: I0227 00:05:53.886135 4781 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 27 00:05:53 crc kubenswrapper[4781]: E0227 00:05:53.891442 4781 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: 
cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:53Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 00:05:54 crc kubenswrapper[4781]: I0227 00:05:54.258169 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:54Z is after 2026-02-23T05:33:13Z Feb 27 00:05:55 crc kubenswrapper[4781]: E0227 00:05:55.142560 4781 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:55Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1897f1b368abff74 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.251122036 +0000 UTC m=+0.508661590,LastTimestamp:2026-02-27 00:05:31.251122036 +0000 UTC m=+0.508661590,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:05:55 crc kubenswrapper[4781]: I0227 00:05:55.258668 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-02-27T00:05:55Z is after 2026-02-23T05:33:13Z Feb 27 00:05:56 crc kubenswrapper[4781]: I0227 00:05:56.257258 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:56Z is after 2026-02-23T05:33:13Z Feb 27 00:05:56 crc kubenswrapper[4781]: I0227 00:05:56.284689 4781 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 27 00:05:56 crc kubenswrapper[4781]: I0227 00:05:56.284764 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 27 00:05:56 crc kubenswrapper[4781]: W0227 00:05:56.982496 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:56Z is after 2026-02-23T05:33:13Z Feb 27 00:05:56 crc kubenswrapper[4781]: E0227 00:05:56.982563 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:56Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 00:05:57 crc kubenswrapper[4781]: I0227 00:05:57.258248 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:57Z is after 2026-02-23T05:33:13Z Feb 27 00:05:57 crc kubenswrapper[4781]: W0227 00:05:57.382707 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:57Z is after 2026-02-23T05:33:13Z Feb 27 00:05:57 crc kubenswrapper[4781]: E0227 00:05:57.382818 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:57Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 00:05:58 crc kubenswrapper[4781]: W0227 00:05:58.143134 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:58Z is after 
2026-02-23T05:33:13Z Feb 27 00:05:58 crc kubenswrapper[4781]: E0227 00:05:58.143286 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:58Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 00:05:58 crc kubenswrapper[4781]: W0227 00:05:58.258530 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:58Z is after 2026-02-23T05:33:13Z Feb 27 00:05:58 crc kubenswrapper[4781]: E0227 00:05:58.258672 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:58Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 00:05:58 crc kubenswrapper[4781]: I0227 00:05:58.263200 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:58Z is after 2026-02-23T05:33:13Z Feb 27 00:05:58 crc kubenswrapper[4781]: E0227 00:05:58.545154 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:58Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 27 00:05:58 crc kubenswrapper[4781]: I0227 00:05:58.546300 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:05:58 crc kubenswrapper[4781]: I0227 00:05:58.547569 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:05:58 crc kubenswrapper[4781]: I0227 00:05:58.547656 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:05:58 crc kubenswrapper[4781]: I0227 00:05:58.547676 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:05:58 crc kubenswrapper[4781]: I0227 00:05:58.547713 4781 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 00:05:58 crc kubenswrapper[4781]: E0227 00:05:58.552662 4781 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:58Z is after 2026-02-23T05:33:13Z" node="crc" Feb 27 00:05:59 crc kubenswrapper[4781]: I0227 00:05:59.256985 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:05:59Z is after 2026-02-23T05:33:13Z Feb 27 00:06:00 crc kubenswrapper[4781]: I0227 00:06:00.258445 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode 
publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:00Z is after 2026-02-23T05:33:13Z Feb 27 00:06:01 crc kubenswrapper[4781]: I0227 00:06:01.256248 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:01Z is after 2026-02-23T05:33:13Z Feb 27 00:06:01 crc kubenswrapper[4781]: E0227 00:06:01.385880 4781 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 27 00:06:02 crc kubenswrapper[4781]: I0227 00:06:02.257651 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:02Z is after 2026-02-23T05:33:13Z Feb 27 00:06:03 crc kubenswrapper[4781]: I0227 00:06:03.257976 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:03Z is after 2026-02-23T05:33:13Z Feb 27 00:06:03 crc kubenswrapper[4781]: I0227 00:06:03.610797 4781 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:41686->192.168.126.11:10357: read: 
connection reset by peer" start-of-body= Feb 27 00:06:03 crc kubenswrapper[4781]: I0227 00:06:03.610879 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:41686->192.168.126.11:10357: read: connection reset by peer" Feb 27 00:06:03 crc kubenswrapper[4781]: I0227 00:06:03.610986 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 00:06:03 crc kubenswrapper[4781]: I0227 00:06:03.611179 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:03 crc kubenswrapper[4781]: I0227 00:06:03.616952 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:03 crc kubenswrapper[4781]: I0227 00:06:03.617025 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:03 crc kubenswrapper[4781]: I0227 00:06:03.617104 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:03 crc kubenswrapper[4781]: I0227 00:06:03.617998 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"32830bc550c9e0ba52cce04b34a09da88eb495a00d5cf27b85ee7a4a76cd494a"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Feb 27 00:06:03 crc kubenswrapper[4781]: I0227 00:06:03.618271 4781 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://32830bc550c9e0ba52cce04b34a09da88eb495a00d5cf27b85ee7a4a76cd494a" gracePeriod=30 Feb 27 00:06:04 crc kubenswrapper[4781]: I0227 00:06:04.255911 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:04Z is after 2026-02-23T05:33:13Z Feb 27 00:06:04 crc kubenswrapper[4781]: I0227 00:06:04.463894 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 27 00:06:04 crc kubenswrapper[4781]: I0227 00:06:04.464589 4781 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="32830bc550c9e0ba52cce04b34a09da88eb495a00d5cf27b85ee7a4a76cd494a" exitCode=255 Feb 27 00:06:04 crc kubenswrapper[4781]: I0227 00:06:04.464667 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"32830bc550c9e0ba52cce04b34a09da88eb495a00d5cf27b85ee7a4a76cd494a"} Feb 27 00:06:04 crc kubenswrapper[4781]: I0227 00:06:04.464706 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a3889beb7621e75bd03228c9b04e770fea726761853b41eccbb6272ee0e5d21a"} Feb 27 00:06:04 crc kubenswrapper[4781]: I0227 00:06:04.464828 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 
00:06:04 crc kubenswrapper[4781]: I0227 00:06:04.466031 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:04 crc kubenswrapper[4781]: I0227 00:06:04.466070 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:04 crc kubenswrapper[4781]: I0227 00:06:04.466088 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:05 crc kubenswrapper[4781]: E0227 00:06:05.146621 4781 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:05Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1897f1b368abff74 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.251122036 +0000 UTC m=+0.508661590,LastTimestamp:2026-02-27 00:05:31.251122036 +0000 UTC m=+0.508661590,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:05 crc kubenswrapper[4781]: I0227 00:06:05.257308 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:05Z is after 2026-02-23T05:33:13Z Feb 27 00:06:05 crc kubenswrapper[4781]: E0227 00:06:05.550524 4781 controller.go:145] "Failed to ensure lease exists, will retry" 
err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:05Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 27 00:06:05 crc kubenswrapper[4781]: I0227 00:06:05.552807 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:05 crc kubenswrapper[4781]: I0227 00:06:05.554432 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:05 crc kubenswrapper[4781]: I0227 00:06:05.554513 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:05 crc kubenswrapper[4781]: I0227 00:06:05.554539 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:05 crc kubenswrapper[4781]: I0227 00:06:05.554587 4781 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 00:06:05 crc kubenswrapper[4781]: E0227 00:06:05.559314 4781 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:05Z is after 2026-02-23T05:33:13Z" node="crc" Feb 27 00:06:06 crc kubenswrapper[4781]: I0227 00:06:06.258669 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:06Z is after 2026-02-23T05:33:13Z Feb 27 00:06:06 crc kubenswrapper[4781]: I0227 00:06:06.759702 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 00:06:06 crc kubenswrapper[4781]: I0227 00:06:06.759913 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:06 crc kubenswrapper[4781]: I0227 00:06:06.761324 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:06 crc kubenswrapper[4781]: I0227 00:06:06.761528 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:06 crc kubenswrapper[4781]: I0227 00:06:06.761721 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:07 crc kubenswrapper[4781]: I0227 00:06:07.258620 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:07Z is after 2026-02-23T05:33:13Z Feb 27 00:06:08 crc kubenswrapper[4781]: I0227 00:06:08.258603 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:08Z is after 2026-02-23T05:33:13Z Feb 27 00:06:08 crc kubenswrapper[4781]: I0227 00:06:08.308670 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:08 crc kubenswrapper[4781]: I0227 00:06:08.310051 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:08 crc kubenswrapper[4781]: I0227 00:06:08.310190 4781 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:08 crc kubenswrapper[4781]: I0227 00:06:08.310315 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:08 crc kubenswrapper[4781]: I0227 00:06:08.311180 4781 scope.go:117] "RemoveContainer" containerID="2c8810f20fd274cb743f419a437c4cab58c3ca4bf18ab25919217a2e2cb4c3b1" Feb 27 00:06:09 crc kubenswrapper[4781]: I0227 00:06:09.257816 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:09Z is after 2026-02-23T05:33:13Z Feb 27 00:06:09 crc kubenswrapper[4781]: I0227 00:06:09.480742 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 27 00:06:09 crc kubenswrapper[4781]: I0227 00:06:09.484050 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9a88178cd0279fc812438f2f0bdbf17e596c3b8da7b7acc17b661c15e3e2f06f"} Feb 27 00:06:09 crc kubenswrapper[4781]: I0227 00:06:09.484269 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:09 crc kubenswrapper[4781]: I0227 00:06:09.485319 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:09 crc kubenswrapper[4781]: I0227 00:06:09.485368 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:09 crc kubenswrapper[4781]: I0227 00:06:09.485387 4781 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:10 crc kubenswrapper[4781]: I0227 00:06:10.257881 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:10Z is after 2026-02-23T05:33:13Z Feb 27 00:06:10 crc kubenswrapper[4781]: I0227 00:06:10.489823 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 27 00:06:10 crc kubenswrapper[4781]: I0227 00:06:10.490583 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 27 00:06:10 crc kubenswrapper[4781]: I0227 00:06:10.493737 4781 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9a88178cd0279fc812438f2f0bdbf17e596c3b8da7b7acc17b661c15e3e2f06f" exitCode=255 Feb 27 00:06:10 crc kubenswrapper[4781]: I0227 00:06:10.493796 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9a88178cd0279fc812438f2f0bdbf17e596c3b8da7b7acc17b661c15e3e2f06f"} Feb 27 00:06:10 crc kubenswrapper[4781]: I0227 00:06:10.493845 4781 scope.go:117] "RemoveContainer" containerID="2c8810f20fd274cb743f419a437c4cab58c3ca4bf18ab25919217a2e2cb4c3b1" Feb 27 00:06:10 crc kubenswrapper[4781]: I0227 00:06:10.494074 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:10 crc kubenswrapper[4781]: I0227 00:06:10.496171 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 27 00:06:10 crc kubenswrapper[4781]: I0227 00:06:10.496228 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:10 crc kubenswrapper[4781]: I0227 00:06:10.496248 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:10 crc kubenswrapper[4781]: I0227 00:06:10.497256 4781 scope.go:117] "RemoveContainer" containerID="9a88178cd0279fc812438f2f0bdbf17e596c3b8da7b7acc17b661c15e3e2f06f" Feb 27 00:06:10 crc kubenswrapper[4781]: E0227 00:06:10.497687 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 00:06:10 crc kubenswrapper[4781]: I0227 00:06:10.827588 4781 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 27 00:06:10 crc kubenswrapper[4781]: E0227 00:06:10.833688 4781 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:10Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 00:06:10 crc kubenswrapper[4781]: E0227 00:06:10.835005 4781 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" 
logger="UnhandledError" Feb 27 00:06:11 crc kubenswrapper[4781]: I0227 00:06:11.258489 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:11Z is after 2026-02-23T05:33:13Z Feb 27 00:06:11 crc kubenswrapper[4781]: E0227 00:06:11.386170 4781 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 27 00:06:11 crc kubenswrapper[4781]: I0227 00:06:11.499755 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 27 00:06:12 crc kubenswrapper[4781]: W0227 00:06:12.215852 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:12Z is after 2026-02-23T05:33:13Z Feb 27 00:06:12 crc kubenswrapper[4781]: E0227 00:06:12.216376 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:12Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 00:06:12 crc kubenswrapper[4781]: I0227 00:06:12.258572 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:12Z is after 2026-02-23T05:33:13Z Feb 27 00:06:12 crc kubenswrapper[4781]: E0227 00:06:12.555801 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:12Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 27 00:06:12 crc kubenswrapper[4781]: I0227 00:06:12.559909 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:12 crc kubenswrapper[4781]: I0227 00:06:12.561130 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:12 crc kubenswrapper[4781]: I0227 00:06:12.561173 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:12 crc kubenswrapper[4781]: I0227 00:06:12.561190 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:12 crc kubenswrapper[4781]: I0227 00:06:12.561222 4781 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 00:06:12 crc kubenswrapper[4781]: E0227 00:06:12.566292 4781 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:12Z is after 2026-02-23T05:33:13Z" node="crc" Feb 27 00:06:13 crc kubenswrapper[4781]: I0227 00:06:13.257614 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:13Z is after 2026-02-23T05:33:13Z Feb 27 00:06:13 crc kubenswrapper[4781]: I0227 00:06:13.282155 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:06:13 crc kubenswrapper[4781]: I0227 00:06:13.282398 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:13 crc kubenswrapper[4781]: I0227 00:06:13.283419 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 00:06:13 crc kubenswrapper[4781]: I0227 00:06:13.283693 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:13 crc kubenswrapper[4781]: I0227 00:06:13.284124 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:13 crc kubenswrapper[4781]: I0227 00:06:13.284362 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:13 crc kubenswrapper[4781]: I0227 00:06:13.284588 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:13 crc kubenswrapper[4781]: I0227 00:06:13.284907 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:13 crc kubenswrapper[4781]: I0227 00:06:13.285002 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:13 crc kubenswrapper[4781]: I0227 00:06:13.285031 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 
00:06:13 crc kubenswrapper[4781]: I0227 00:06:13.286003 4781 scope.go:117] "RemoveContainer" containerID="9a88178cd0279fc812438f2f0bdbf17e596c3b8da7b7acc17b661c15e3e2f06f" Feb 27 00:06:13 crc kubenswrapper[4781]: E0227 00:06:13.286426 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 00:06:13 crc kubenswrapper[4781]: W0227 00:06:13.764066 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:13Z is after 2026-02-23T05:33:13Z Feb 27 00:06:13 crc kubenswrapper[4781]: E0227 00:06:13.765054 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:13Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 00:06:14 crc kubenswrapper[4781]: I0227 00:06:14.257469 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:14Z is after 2026-02-23T05:33:13Z Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 
00:06:15.154047 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b368abff74 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.251122036 +0000 UTC m=+0.508661590,LastTimestamp:2026-02-27 00:05:31.251122036 +0000 UTC m=+0.508661590,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.160726 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36babef97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301449623 +0000 UTC m=+0.558989177,LastTimestamp:2026-02-27 00:05:31.301449623 +0000 UTC m=+0.558989177,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.167200 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{crc.1897f1b36bac1e09 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301461513 +0000 UTC m=+0.559001057,LastTimestamp:2026-02-27 00:05:31.301461513 +0000 UTC m=+0.559001057,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.174115 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36bac3cc7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301469383 +0000 UTC m=+0.559008927,LastTimestamp:2026-02-27 00:05:31.301469383 +0000 UTC m=+0.559008927,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.180207 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b37067555d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.380839773 +0000 UTC m=+0.638379327,LastTimestamp:2026-02-27 00:05:31.380839773 +0000 UTC m=+0.638379327,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.186726 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36babef97\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36babef97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301449623 +0000 UTC m=+0.558989177,LastTimestamp:2026-02-27 00:05:31.409728921 +0000 UTC m=+0.667268465,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.192786 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36bac1e09\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36bac1e09 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301461513 +0000 UTC m=+0.559001057,LastTimestamp:2026-02-27 00:05:31.409745722 +0000 UTC m=+0.667285276,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.199240 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36bac3cc7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36bac3cc7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301469383 +0000 UTC m=+0.559008927,LastTimestamp:2026-02-27 00:05:31.409753442 +0000 UTC m=+0.667292996,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.205035 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36babef97\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36babef97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301449623 +0000 UTC 
m=+0.558989177,LastTimestamp:2026-02-27 00:05:31.410482951 +0000 UTC m=+0.668022505,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.212714 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36bac1e09\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36bac1e09 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301461513 +0000 UTC m=+0.559001057,LastTimestamp:2026-02-27 00:05:31.410500902 +0000 UTC m=+0.668040456,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.216729 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36bac3cc7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36bac3cc7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301469383 +0000 UTC m=+0.559008927,LastTimestamp:2026-02-27 00:05:31.410510492 +0000 UTC m=+0.668050046,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.220958 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36babef97\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36babef97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301449623 +0000 UTC m=+0.558989177,LastTimestamp:2026-02-27 00:05:31.411119779 +0000 UTC m=+0.668659343,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.225162 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36bac1e09\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36bac1e09 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301461513 +0000 UTC m=+0.559001057,LastTimestamp:2026-02-27 00:05:31.411134939 +0000 UTC m=+0.668674513,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.230617 4781 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36bac3cc7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36bac3cc7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301469383 +0000 UTC m=+0.559008927,LastTimestamp:2026-02-27 00:05:31.411146569 +0000 UTC m=+0.668686133,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.233756 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36babef97\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36babef97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301449623 +0000 UTC m=+0.558989177,LastTimestamp:2026-02-27 00:05:31.411623492 +0000 UTC m=+0.669163036,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.237817 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36bac1e09\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" 
in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36bac1e09 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301461513 +0000 UTC m=+0.559001057,LastTimestamp:2026-02-27 00:05:31.411663813 +0000 UTC m=+0.669203367,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.241686 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36bac3cc7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36bac3cc7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301469383 +0000 UTC m=+0.559008927,LastTimestamp:2026-02-27 00:05:31.411672434 +0000 UTC m=+0.669211988,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.245727 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36babef97\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36babef97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301449623 +0000 UTC m=+0.558989177,LastTimestamp:2026-02-27 00:05:31.411751676 +0000 UTC m=+0.669291230,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.250331 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36bac1e09\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36bac1e09 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301461513 +0000 UTC m=+0.559001057,LastTimestamp:2026-02-27 00:05:31.411768796 +0000 UTC m=+0.669308350,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.253820 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36bac3cc7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36bac3cc7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc 
status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301469383 +0000 UTC m=+0.559008927,LastTimestamp:2026-02-27 00:05:31.411777586 +0000 UTC m=+0.669317140,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: I0227 00:06:15.254473 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.259542 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36babef97\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36babef97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301449623 +0000 UTC m=+0.558989177,LastTimestamp:2026-02-27 00:05:31.412463035 +0000 UTC m=+0.670002589,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.265818 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36bac1e09\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36bac1e09 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301461513 +0000 UTC m=+0.559001057,LastTimestamp:2026-02-27 00:05:31.412481435 +0000 UTC m=+0.670020989,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.269662 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36bac3cc7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36bac3cc7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301469383 +0000 UTC m=+0.559008927,LastTimestamp:2026-02-27 00:05:31.412489926 +0000 UTC m=+0.670029480,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.273493 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36babef97\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36babef97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301449623 +0000 UTC m=+0.558989177,LastTimestamp:2026-02-27 00:05:31.412521297 +0000 UTC m=+0.670060851,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.280416 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f1b36bac1e09\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f1b36bac1e09 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.301461513 +0000 UTC m=+0.559001057,LastTimestamp:2026-02-27 00:05:31.412530517 +0000 UTC m=+0.670070071,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.286454 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b388dd45a6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.791222182 +0000 UTC m=+1.048761756,LastTimestamp:2026-02-27 00:05:31.791222182 +0000 UTC m=+1.048761756,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.292952 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b388de8923 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.791304995 +0000 UTC m=+1.048844579,LastTimestamp:2026-02-27 00:05:31.791304995 +0000 UTC m=+1.048844579,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.298596 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1b3890d2d3a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.794361658 +0000 UTC m=+1.051901262,LastTimestamp:2026-02-27 00:05:31.794361658 +0000 UTC m=+1.051901262,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.304617 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897f1b389501661 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.798746721 +0000 UTC m=+1.056286315,LastTimestamp:2026-02-27 00:05:31.798746721 +0000 UTC m=+1.056286315,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 
00:06:15.309331 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897f1b389e84946 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:31.808721222 +0000 UTC m=+1.066260776,LastTimestamp:2026-02-27 00:05:31.808721222 +0000 UTC m=+1.066260776,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.313811 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897f1b3a9ceeb1f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.343929631 +0000 UTC m=+1.601469185,LastTimestamp:2026-02-27 
00:05:32.343929631 +0000 UTC m=+1.601469185,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.320421 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1b3a9d04ec2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.344020674 +0000 UTC m=+1.601560228,LastTimestamp:2026-02-27 00:05:32.344020674 +0000 UTC m=+1.601560228,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.324717 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b3a9dac146 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container 
setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.34470535 +0000 UTC m=+1.602244904,LastTimestamp:2026-02-27 00:05:32.34470535 +0000 UTC m=+1.602244904,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.328671 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b3a9db2eed openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.344733421 +0000 UTC m=+1.602272975,LastTimestamp:2026-02-27 00:05:32.344733421 +0000 UTC m=+1.602272975,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.332367 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897f1b3a9dbe38f openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container 
setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.344779663 +0000 UTC m=+1.602319217,LastTimestamp:2026-02-27 00:05:32.344779663 +0000 UTC m=+1.602319217,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.337139 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1b3aa59d999 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.353034649 +0000 UTC m=+1.610574203,LastTimestamp:2026-02-27 00:05:32.353034649 +0000 UTC m=+1.610574203,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.338423 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1b3aa6fbe11 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.354469393 +0000 UTC m=+1.612008947,LastTimestamp:2026-02-27 00:05:32.354469393 +0000 UTC m=+1.612008947,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.343401 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b3aa95f3cc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.356973516 +0000 UTC m=+1.614513060,LastTimestamp:2026-02-27 00:05:32.356973516 +0000 UTC m=+1.614513060,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.345522 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" 
event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897f1b3aab2dd3e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.358868286 +0000 UTC m=+1.616407840,LastTimestamp:2026-02-27 00:05:32.358868286 +0000 UTC m=+1.616407840,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.350266 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b3aac10b24 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.35979754 +0000 UTC m=+1.617337094,LastTimestamp:2026-02-27 00:05:32.35979754 +0000 UTC m=+1.617337094,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.357425 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897f1b3aac2bba9 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.359908265 +0000 UTC m=+1.617447819,LastTimestamp:2026-02-27 00:05:32.359908265 +0000 UTC m=+1.617447819,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.364305 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1b3ba40187b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.619782267 +0000 UTC m=+1.877321821,LastTimestamp:2026-02-27 00:05:32.619782267 +0000 UTC m=+1.877321821,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.372081 4781 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1b3bae8e7cc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.630845388 +0000 UTC m=+1.888384942,LastTimestamp:2026-02-27 00:05:32.630845388 +0000 UTC m=+1.888384942,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.381508 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1b3bb01792c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.632455468 +0000 UTC 
m=+1.889995032,LastTimestamp:2026-02-27 00:05:32.632455468 +0000 UTC m=+1.889995032,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.389017 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1b3c793112a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.84332369 +0000 UTC m=+2.100863264,LastTimestamp:2026-02-27 00:05:32.84332369 +0000 UTC m=+2.100863264,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.395974 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1b3c856d6ec openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.856153836 +0000 UTC m=+2.113693410,LastTimestamp:2026-02-27 00:05:32.856153836 +0000 UTC m=+2.113693410,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.402901 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1b3c86713e0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.857218016 +0000 UTC m=+2.114757580,LastTimestamp:2026-02-27 00:05:32.857218016 +0000 UTC m=+2.114757580,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.409867 4781 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1b3d3d2fa7f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.048838783 +0000 UTC m=+2.306378337,LastTimestamp:2026-02-27 00:05:33.048838783 +0000 UTC m=+2.306378337,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.417248 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1b3d4d51145 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.065752901 +0000 UTC m=+2.323292495,LastTimestamp:2026-02-27 00:05:33.065752901 +0000 UTC 
m=+2.323292495,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.425854 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897f1b3e44be5b0 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.325198768 +0000 UTC m=+2.582738322,LastTimestamp:2026-02-27 00:05:33.325198768 +0000 UTC m=+2.582738322,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.432742 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897f1b3e467cf76 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.327028086 +0000 UTC m=+2.584567640,LastTimestamp:2026-02-27 00:05:33.327028086 +0000 UTC m=+2.584567640,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.441009 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b3e4d9e564 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.334504804 +0000 UTC m=+2.592044378,LastTimestamp:2026-02-27 00:05:33.334504804 +0000 UTC m=+2.592044378,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.447745 4781 event.go:359] "Server rejected event 
(will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b3e4e61f1d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.335306013 +0000 UTC m=+2.592845577,LastTimestamp:2026-02-27 00:05:33.335306013 +0000 UTC m=+2.592845577,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.455423 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897f1b3f1452e4d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.542862413 +0000 UTC m=+2.800401967,LastTimestamp:2026-02-27 00:05:33.542862413 +0000 UTC m=+2.800401967,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.461971 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b3f154b4aa openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.54387985 +0000 UTC m=+2.801419404,LastTimestamp:2026-02-27 00:05:33.54387985 +0000 UTC m=+2.801419404,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.466363 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897f1b3f15b1bca openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.544299466 +0000 UTC m=+2.801839020,LastTimestamp:2026-02-27 00:05:33.544299466 +0000 UTC 
m=+2.801839020,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.472162 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b3f22d2e13 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.558066707 +0000 UTC m=+2.815606271,LastTimestamp:2026-02-27 00:05:33.558066707 +0000 UTC m=+2.815606271,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.478447 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897f1b3f27e2482 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.563372674 +0000 UTC 
m=+2.820912238,LastTimestamp:2026-02-27 00:05:33.563372674 +0000 UTC m=+2.820912238,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.484660 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897f1b3f28a9169 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.564186985 +0000 UTC m=+2.821726539,LastTimestamp:2026-02-27 00:05:33.564186985 +0000 UTC m=+2.821726539,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.490825 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b3f2917949 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.564639561 +0000 UTC m=+2.822179125,LastTimestamp:2026-02-27 00:05:33.564639561 +0000 UTC m=+2.822179125,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.497748 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897f1b3f292160d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.564679693 +0000 UTC m=+2.822219247,LastTimestamp:2026-02-27 00:05:33.564679693 +0000 UTC m=+2.822219247,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.503147 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b3f36fcb47 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.579209543 +0000 UTC m=+2.836749097,LastTimestamp:2026-02-27 00:05:33.579209543 +0000 UTC m=+2.836749097,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.507535 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b3f37c9b8e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.580049294 +0000 UTC m=+2.837588848,LastTimestamp:2026-02-27 00:05:33.580049294 +0000 UTC m=+2.837588848,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.512559 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897f1b3fd529034 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.745066036 +0000 UTC m=+3.002605590,LastTimestamp:2026-02-27 00:05:33.745066036 +0000 UTC m=+3.002605590,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.517570 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b3fd5dca22 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.745801762 +0000 UTC m=+3.003341316,LastTimestamp:2026-02-27 00:05:33.745801762 +0000 UTC m=+3.003341316,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 
00:06:15.523415 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b3fe07df6d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.756948333 +0000 UTC m=+3.014487897,LastTimestamp:2026-02-27 00:05:33.756948333 +0000 UTC m=+3.014487897,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.529781 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b3fe25438f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.758874511 +0000 UTC m=+3.016414065,LastTimestamp:2026-02-27 
00:05:33.758874511 +0000 UTC m=+3.016414065,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.535623 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897f1b3fe2b5d55 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.759274325 +0000 UTC m=+3.016813879,LastTimestamp:2026-02-27 00:05:33.759274325 +0000 UTC m=+3.016813879,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.542569 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897f1b3fe4638cf openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.761034447 +0000 UTC m=+3.018574001,LastTimestamp:2026-02-27 00:05:33.761034447 +0000 UTC m=+3.018574001,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.548098 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897f1b40ad10aee openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.971458798 +0000 UTC m=+3.228998352,LastTimestamp:2026-02-27 00:05:33.971458798 +0000 UTC m=+3.228998352,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.554578 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b40ae7b516 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.97294415 +0000 UTC m=+3.230483704,LastTimestamp:2026-02-27 00:05:33.97294415 +0000 UTC m=+3.230483704,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.560370 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897f1b40bbcd71f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.986912031 +0000 UTC m=+3.244451595,LastTimestamp:2026-02-27 00:05:33.986912031 +0000 UTC m=+3.244451595,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.567421 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b40c0662d0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.99173192 +0000 UTC m=+3.249271474,LastTimestamp:2026-02-27 00:05:33.99173192 +0000 UTC m=+3.249271474,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.575081 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b40c1bc877 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:33.993134199 +0000 UTC m=+3.250673753,LastTimestamp:2026-02-27 00:05:33.993134199 +0000 UTC m=+3.250673753,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.585868 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b4151132f5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:34.143435509 +0000 UTC m=+3.400975103,LastTimestamp:2026-02-27 00:05:34.143435509 +0000 UTC m=+3.400975103,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.593135 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b415a3d83c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:34.153046076 +0000 UTC m=+3.410585630,LastTimestamp:2026-02-27 
00:05:34.153046076 +0000 UTC m=+3.410585630,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.600517 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b415b065dd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:34.153868765 +0000 UTC m=+3.411408319,LastTimestamp:2026-02-27 00:05:34.153868765 +0000 UTC m=+3.411408319,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.607267 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b420a61040 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:34.337740864 +0000 UTC m=+3.595280418,LastTimestamp:2026-02-27 00:05:34.337740864 +0000 UTC m=+3.595280418,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.615912 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b4212cd882 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:34.346573954 +0000 UTC m=+3.604113518,LastTimestamp:2026-02-27 00:05:34.346573954 +0000 UTC m=+3.604113518,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.621479 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b42140a529 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:34.347871529 +0000 UTC m=+3.605411083,LastTimestamp:2026-02-27 00:05:34.347871529 +0000 UTC m=+3.605411083,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.627058 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b42d170f33 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:34.546472755 +0000 UTC m=+3.804012309,LastTimestamp:2026-02-27 00:05:34.546472755 +0000 UTC m=+3.804012309,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.631868 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b42df4c946 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:34.561003846 +0000 UTC m=+3.818543400,LastTimestamp:2026-02-27 00:05:34.561003846 +0000 UTC m=+3.818543400,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.641665 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b45dc59444 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:35.363216452 +0000 UTC m=+4.620756046,LastTimestamp:2026-02-27 00:05:35.363216452 +0000 UTC m=+4.620756046,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.648565 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b46a41eeaa openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:35.57269265 +0000 UTC m=+4.830232214,LastTimestamp:2026-02-27 00:05:35.57269265 +0000 UTC m=+4.830232214,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.659041 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b46ac45b7e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:35.58124019 +0000 UTC m=+4.838779754,LastTimestamp:2026-02-27 00:05:35.58124019 +0000 UTC m=+4.838779754,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.660727 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b46ad6d0a9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:35.582449833 +0000 UTC m=+4.839989397,LastTimestamp:2026-02-27 00:05:35.582449833 +0000 UTC m=+4.839989397,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.666824 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b47a3bb42a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:35.840719914 +0000 UTC m=+5.098259518,LastTimestamp:2026-02-27 00:05:35.840719914 +0000 UTC m=+5.098259518,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.675904 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in 
API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b47b4aeb83 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:35.858494339 +0000 UTC m=+5.116033943,LastTimestamp:2026-02-27 00:05:35.858494339 +0000 UTC m=+5.116033943,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.681370 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b47b63a5da openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:35.860114906 +0000 UTC m=+5.117654500,LastTimestamp:2026-02-27 00:05:35.860114906 +0000 UTC m=+5.117654500,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.685504 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b489a0a2ec openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:36.098992876 +0000 UTC m=+5.356532430,LastTimestamp:2026-02-27 00:05:36.098992876 +0000 UTC m=+5.356532430,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.690636 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b48a571d6a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:36.110951786 +0000 UTC m=+5.368491340,LastTimestamp:2026-02-27 00:05:36.110951786 +0000 UTC m=+5.368491340,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.697994 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b48a66dab2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:36.111983282 +0000 UTC m=+5.369522876,LastTimestamp:2026-02-27 00:05:36.111983282 +0000 UTC m=+5.369522876,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.703003 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b49767c3e2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:36.330146786 +0000 UTC m=+5.587686350,LastTimestamp:2026-02-27 00:05:36.330146786 +0000 UTC m=+5.587686350,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.707918 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b498649389 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:36.346715017 +0000 UTC m=+5.604254591,LastTimestamp:2026-02-27 00:05:36.346715017 +0000 UTC m=+5.604254591,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.712526 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b49870965f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:36.347502175 +0000 UTC m=+5.605041739,LastTimestamp:2026-02-27 00:05:36.347502175 +0000 UTC m=+5.605041739,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.716559 4781 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b4a488ac9a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:36.550407322 +0000 UTC m=+5.807946906,LastTimestamp:2026-02-27 00:05:36.550407322 +0000 UTC m=+5.807946906,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.722900 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f1b4a5479826 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:36.562919462 +0000 UTC m=+5.820459056,LastTimestamp:2026-02-27 00:05:36.562919462 +0000 UTC m=+5.820459056,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.730910 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Feb 27 00:06:15 crc kubenswrapper[4781]: &Event{ObjectMeta:{kube-apiserver-crc.1897f1b6a4b6e421 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Feb 27 00:06:15 crc kubenswrapper[4781]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 27 00:06:15 crc kubenswrapper[4781]: Feb 27 00:06:15 crc kubenswrapper[4781]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:45.143370785 +0000 UTC m=+14.400910369,LastTimestamp:2026-02-27 00:05:45.143370785 +0000 UTC m=+14.400910369,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 27 00:06:15 crc kubenswrapper[4781]: > Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.737100 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b6a4b9c006 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 
403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:45.14355815 +0000 UTC m=+14.401097744,LastTimestamp:2026-02-27 00:05:45.14355815 +0000 UTC m=+14.401097744,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.741773 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1897f1b6a4b6e421\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Feb 27 00:06:15 crc kubenswrapper[4781]: &Event{ObjectMeta:{kube-apiserver-crc.1897f1b6a4b6e421 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Feb 27 00:06:15 crc kubenswrapper[4781]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 27 00:06:15 crc kubenswrapper[4781]: Feb 27 00:06:15 crc kubenswrapper[4781]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:45.143370785 +0000 UTC m=+14.400910369,LastTimestamp:2026-02-27 00:05:45.148316555 +0000 UTC m=+14.405856149,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 27 00:06:15 crc kubenswrapper[4781]: > Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.746552 4781 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-apiserver-crc.1897f1b6a4b9c006\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b6a4b9c006 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:45.14355815 +0000 UTC m=+14.401097744,LastTimestamp:2026-02-27 00:05:45.14851044 +0000 UTC m=+14.406050034,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.752409 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 27 00:06:15 crc kubenswrapper[4781]: &Event{ObjectMeta:{kube-controller-manager-crc.1897f1b6e8b538a4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 27 00:06:15 crc kubenswrapper[4781]: body: Feb 27 00:06:15 crc kubenswrapper[4781]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:46.284112036 +0000 UTC m=+15.541651590,LastTimestamp:2026-02-27 00:05:46.284112036 +0000 UTC m=+15.541651590,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 27 00:06:15 crc kubenswrapper[4781]: > Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.757528 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1b6e8b5caef openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:46.284149487 +0000 UTC m=+15.541689041,LastTimestamp:2026-02-27 00:05:46.284149487 +0000 UTC m=+15.541689041,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.762152 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1897f1b415b065dd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b415b065dd openshift-kube-apiserver 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:34.153868765 +0000 UTC m=+3.411408319,LastTimestamp:2026-02-27 00:05:46.4018509 +0000 UTC m=+15.659390494,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.765736 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1897f1b420a61040\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b420a61040 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:34.337740864 +0000 UTC m=+3.595280418,LastTimestamp:2026-02-27 00:05:46.584296783 +0000 UTC m=+15.841836347,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.766857 4781 event.go:359] "Server rejected 
event (will not retry!)" err="events \"kube-apiserver-crc.1897f1b42140a529\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f1b42140a529 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:34.347871529 +0000 UTC m=+3.605411083,LastTimestamp:2026-02-27 00:05:46.596296309 +0000 UTC m=+15.853835883,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.774359 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897f1b6e8b538a4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 27 00:06:15 crc kubenswrapper[4781]: &Event{ObjectMeta:{kube-controller-manager-crc.1897f1b6e8b538a4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 27 00:06:15 crc kubenswrapper[4781]: 
body: Feb 27 00:06:15 crc kubenswrapper[4781]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:46.284112036 +0000 UTC m=+15.541651590,LastTimestamp:2026-02-27 00:05:56.284743507 +0000 UTC m=+25.542283091,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 27 00:06:15 crc kubenswrapper[4781]: > Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.778735 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897f1b6e8b5caef\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1b6e8b5caef openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:46.284149487 +0000 UTC m=+15.541689041,LastTimestamp:2026-02-27 00:05:56.284801488 +0000 UTC m=+25.542341072,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.784255 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 27 00:06:15 crc 
kubenswrapper[4781]: &Event{ObjectMeta:{kube-controller-manager-crc.1897f1baf1765d16 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": read tcp 192.168.126.11:41686->192.168.126.11:10357: read: connection reset by peer Feb 27 00:06:15 crc kubenswrapper[4781]: body: Feb 27 00:06:15 crc kubenswrapper[4781]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:06:03.610856726 +0000 UTC m=+32.868396320,LastTimestamp:2026-02-27 00:06:03.610856726 +0000 UTC m=+32.868396320,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 27 00:06:15 crc kubenswrapper[4781]: > Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.788588 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1baf177bec4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:41686->192.168.126.11:10357: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 
00:06:03.610947268 +0000 UTC m=+32.868486852,LastTimestamp:2026-02-27 00:06:03.610947268 +0000 UTC m=+32.868486852,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.794489 4781 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1baf1e71776 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:06:03.61824447 +0000 UTC m=+32.875784064,LastTimestamp:2026-02-27 00:06:03.61824447 +0000 UTC m=+32.875784064,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.799278 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897f1b3aa6fbe11\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1b3aa6fbe11 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.354469393 +0000 UTC m=+1.612008947,LastTimestamp:2026-02-27 00:06:03.631720944 +0000 UTC m=+32.889260538,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.803525 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897f1b3ba40187b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1b3ba40187b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.619782267 +0000 UTC m=+1.877321821,LastTimestamp:2026-02-27 00:06:03.836852833 +0000 UTC m=+33.094392427,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:15 crc kubenswrapper[4781]: E0227 00:06:15.808508 4781 event.go:359] "Server rejected event 
(will not retry!)" err="events \"kube-controller-manager-crc.1897f1b3bae8e7cc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1b3bae8e7cc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:32.630845388 +0000 UTC m=+1.888384942,LastTimestamp:2026-02-27 00:06:03.847694718 +0000 UTC m=+33.105234272,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:16 crc kubenswrapper[4781]: I0227 00:06:16.260807 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:16 crc kubenswrapper[4781]: I0227 00:06:16.284261 4781 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 27 00:06:16 crc kubenswrapper[4781]: I0227 00:06:16.284485 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" 
containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 27 00:06:16 crc kubenswrapper[4781]: E0227 00:06:16.290037 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897f1b6e8b538a4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 27 00:06:16 crc kubenswrapper[4781]: &Event{ObjectMeta:{kube-controller-manager-crc.1897f1b6e8b538a4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 27 00:06:16 crc kubenswrapper[4781]: body: Feb 27 00:06:16 crc kubenswrapper[4781]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:46.284112036 +0000 UTC m=+15.541651590,LastTimestamp:2026-02-27 00:06:16.28443244 +0000 UTC m=+45.541972034,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 27 00:06:16 crc kubenswrapper[4781]: > Feb 27 00:06:16 crc kubenswrapper[4781]: E0227 00:06:16.296964 4781 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897f1b6e8b5caef\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f1b6e8b5caef openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:05:46.284149487 +0000 UTC m=+15.541689041,LastTimestamp:2026-02-27 00:06:16.284534943 +0000 UTC m=+45.542074527,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:06:17 crc kubenswrapper[4781]: I0227 00:06:17.257972 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:18 crc kubenswrapper[4781]: I0227 00:06:18.259120 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:18 crc kubenswrapper[4781]: W0227 00:06:18.502034 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Feb 27 00:06:18 crc kubenswrapper[4781]: E0227 00:06:18.502118 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed 
to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Feb 27 00:06:18 crc kubenswrapper[4781]: I0227 00:06:18.704983 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:06:18 crc kubenswrapper[4781]: I0227 00:06:18.705222 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:18 crc kubenswrapper[4781]: I0227 00:06:18.706670 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:18 crc kubenswrapper[4781]: I0227 00:06:18.706724 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:18 crc kubenswrapper[4781]: I0227 00:06:18.706743 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:18 crc kubenswrapper[4781]: I0227 00:06:18.707487 4781 scope.go:117] "RemoveContainer" containerID="9a88178cd0279fc812438f2f0bdbf17e596c3b8da7b7acc17b661c15e3e2f06f" Feb 27 00:06:18 crc kubenswrapper[4781]: E0227 00:06:18.707822 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 00:06:19 crc kubenswrapper[4781]: I0227 00:06:19.260025 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 
00:06:19 crc kubenswrapper[4781]: E0227 00:06:19.565945 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 27 00:06:19 crc kubenswrapper[4781]: I0227 00:06:19.566966 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:19 crc kubenswrapper[4781]: I0227 00:06:19.568481 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:19 crc kubenswrapper[4781]: I0227 00:06:19.568529 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:19 crc kubenswrapper[4781]: I0227 00:06:19.568551 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:19 crc kubenswrapper[4781]: I0227 00:06:19.568591 4781 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 00:06:19 crc kubenswrapper[4781]: E0227 00:06:19.575698 4781 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 27 00:06:20 crc kubenswrapper[4781]: I0227 00:06:20.260038 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:21 crc kubenswrapper[4781]: W0227 00:06:21.004232 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Feb 27 
00:06:21 crc kubenswrapper[4781]: E0227 00:06:21.004312 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Feb 27 00:06:21 crc kubenswrapper[4781]: I0227 00:06:21.257932 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:21 crc kubenswrapper[4781]: E0227 00:06:21.386829 4781 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 27 00:06:22 crc kubenswrapper[4781]: I0227 00:06:22.257440 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:23 crc kubenswrapper[4781]: I0227 00:06:23.258534 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:24 crc kubenswrapper[4781]: I0227 00:06:24.258497 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:25 crc kubenswrapper[4781]: I0227 00:06:25.259517 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource 
"csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:25 crc kubenswrapper[4781]: I0227 00:06:25.274456 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 27 00:06:25 crc kubenswrapper[4781]: I0227 00:06:25.274888 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:25 crc kubenswrapper[4781]: I0227 00:06:25.276009 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:25 crc kubenswrapper[4781]: I0227 00:06:25.276040 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:25 crc kubenswrapper[4781]: I0227 00:06:25.276051 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:25 crc kubenswrapper[4781]: I0227 00:06:25.583079 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 00:06:25 crc kubenswrapper[4781]: I0227 00:06:25.583359 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:25 crc kubenswrapper[4781]: I0227 00:06:25.585540 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:25 crc kubenswrapper[4781]: I0227 00:06:25.585566 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:25 crc kubenswrapper[4781]: I0227 00:06:25.585574 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:25 crc kubenswrapper[4781]: I0227 00:06:25.587337 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 00:06:26 crc kubenswrapper[4781]: I0227 00:06:26.256556 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:26 crc kubenswrapper[4781]: I0227 00:06:26.542569 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:26 crc kubenswrapper[4781]: I0227 00:06:26.543344 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:26 crc kubenswrapper[4781]: I0227 00:06:26.543365 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:26 crc kubenswrapper[4781]: I0227 00:06:26.543374 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:26 crc kubenswrapper[4781]: E0227 00:06:26.570383 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 27 00:06:26 crc kubenswrapper[4781]: I0227 00:06:26.576558 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:26 crc kubenswrapper[4781]: I0227 00:06:26.577396 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:26 crc kubenswrapper[4781]: I0227 00:06:26.577417 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:26 crc kubenswrapper[4781]: I0227 00:06:26.577425 4781 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:26 crc kubenswrapper[4781]: I0227 00:06:26.577443 4781 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 00:06:26 crc kubenswrapper[4781]: E0227 00:06:26.580447 4781 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 27 00:06:27 crc kubenswrapper[4781]: I0227 00:06:27.259062 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:28 crc kubenswrapper[4781]: I0227 00:06:28.257482 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:29 crc kubenswrapper[4781]: I0227 00:06:29.260492 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:30 crc kubenswrapper[4781]: I0227 00:06:30.291854 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:31 crc kubenswrapper[4781]: I0227 00:06:31.258512 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 
00:06:31 crc kubenswrapper[4781]: E0227 00:06:31.387675 4781 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 27 00:06:32 crc kubenswrapper[4781]: I0227 00:06:32.260885 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:33 crc kubenswrapper[4781]: I0227 00:06:33.258009 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:33 crc kubenswrapper[4781]: I0227 00:06:33.308890 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:33 crc kubenswrapper[4781]: I0227 00:06:33.310686 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:33 crc kubenswrapper[4781]: I0227 00:06:33.310748 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:33 crc kubenswrapper[4781]: I0227 00:06:33.310769 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:33 crc kubenswrapper[4781]: I0227 00:06:33.311596 4781 scope.go:117] "RemoveContainer" containerID="9a88178cd0279fc812438f2f0bdbf17e596c3b8da7b7acc17b661c15e3e2f06f" Feb 27 00:06:33 crc kubenswrapper[4781]: I0227 00:06:33.563425 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 27 00:06:33 crc kubenswrapper[4781]: I0227 00:06:33.565695 4781 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5"} Feb 27 00:06:33 crc kubenswrapper[4781]: E0227 00:06:33.577158 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 27 00:06:33 crc kubenswrapper[4781]: I0227 00:06:33.581067 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:33 crc kubenswrapper[4781]: I0227 00:06:33.581886 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:33 crc kubenswrapper[4781]: I0227 00:06:33.581929 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:33 crc kubenswrapper[4781]: I0227 00:06:33.581943 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:33 crc kubenswrapper[4781]: I0227 00:06:33.581969 4781 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 00:06:33 crc kubenswrapper[4781]: E0227 00:06:33.585392 4781 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 27 00:06:34 crc kubenswrapper[4781]: I0227 00:06:34.267573 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:34 crc kubenswrapper[4781]: I0227 00:06:34.571175 
4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 27 00:06:34 crc kubenswrapper[4781]: I0227 00:06:34.571992 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 27 00:06:34 crc kubenswrapper[4781]: I0227 00:06:34.574207 4781 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5" exitCode=255 Feb 27 00:06:34 crc kubenswrapper[4781]: I0227 00:06:34.574260 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5"} Feb 27 00:06:34 crc kubenswrapper[4781]: I0227 00:06:34.574294 4781 scope.go:117] "RemoveContainer" containerID="9a88178cd0279fc812438f2f0bdbf17e596c3b8da7b7acc17b661c15e3e2f06f" Feb 27 00:06:34 crc kubenswrapper[4781]: I0227 00:06:34.574453 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:34 crc kubenswrapper[4781]: I0227 00:06:34.575610 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:34 crc kubenswrapper[4781]: I0227 00:06:34.575663 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:34 crc kubenswrapper[4781]: I0227 00:06:34.575673 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:34 crc kubenswrapper[4781]: I0227 00:06:34.576136 4781 scope.go:117] "RemoveContainer" 
containerID="556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5" Feb 27 00:06:34 crc kubenswrapper[4781]: E0227 00:06:34.576319 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 00:06:35 crc kubenswrapper[4781]: I0227 00:06:35.257502 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:35 crc kubenswrapper[4781]: I0227 00:06:35.579317 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 27 00:06:35 crc kubenswrapper[4781]: I0227 00:06:35.582475 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:35 crc kubenswrapper[4781]: I0227 00:06:35.583940 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:35 crc kubenswrapper[4781]: I0227 00:06:35.583980 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:35 crc kubenswrapper[4781]: I0227 00:06:35.583989 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:35 crc kubenswrapper[4781]: I0227 00:06:35.584499 4781 scope.go:117] "RemoveContainer" containerID="556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5" Feb 27 00:06:35 crc kubenswrapper[4781]: E0227 
00:06:35.584706 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 00:06:36 crc kubenswrapper[4781]: I0227 00:06:36.260530 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:37 crc kubenswrapper[4781]: I0227 00:06:37.258414 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:38 crc kubenswrapper[4781]: I0227 00:06:38.257305 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:38 crc kubenswrapper[4781]: I0227 00:06:38.704901 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:06:38 crc kubenswrapper[4781]: I0227 00:06:38.706043 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:38 crc kubenswrapper[4781]: I0227 00:06:38.707797 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:38 crc kubenswrapper[4781]: I0227 00:06:38.707849 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 00:06:38 crc kubenswrapper[4781]: I0227 00:06:38.707867 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:38 crc kubenswrapper[4781]: I0227 00:06:38.708730 4781 scope.go:117] "RemoveContainer" containerID="556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5" Feb 27 00:06:38 crc kubenswrapper[4781]: E0227 00:06:38.709010 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 00:06:39 crc kubenswrapper[4781]: I0227 00:06:39.261229 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:40 crc kubenswrapper[4781]: I0227 00:06:40.258096 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:40 crc kubenswrapper[4781]: E0227 00:06:40.578505 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 27 00:06:40 crc kubenswrapper[4781]: I0227 00:06:40.585733 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:40 crc kubenswrapper[4781]: I0227 
00:06:40.586847 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:40 crc kubenswrapper[4781]: I0227 00:06:40.586894 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:40 crc kubenswrapper[4781]: I0227 00:06:40.586905 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:40 crc kubenswrapper[4781]: I0227 00:06:40.586924 4781 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 00:06:40 crc kubenswrapper[4781]: E0227 00:06:40.590503 4781 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 27 00:06:41 crc kubenswrapper[4781]: I0227 00:06:41.260400 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:41 crc kubenswrapper[4781]: E0227 00:06:41.388582 4781 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 27 00:06:42 crc kubenswrapper[4781]: I0227 00:06:42.260327 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:42 crc kubenswrapper[4781]: I0227 00:06:42.837024 4781 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 27 00:06:42 crc kubenswrapper[4781]: I0227 00:06:42.849549 4781 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from 
k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 27 00:06:43 crc kubenswrapper[4781]: I0227 00:06:43.260141 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:43 crc kubenswrapper[4781]: I0227 00:06:43.282490 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:06:43 crc kubenswrapper[4781]: I0227 00:06:43.282816 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:43 crc kubenswrapper[4781]: I0227 00:06:43.284127 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:43 crc kubenswrapper[4781]: I0227 00:06:43.284183 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:43 crc kubenswrapper[4781]: I0227 00:06:43.284209 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:43 crc kubenswrapper[4781]: I0227 00:06:43.285009 4781 scope.go:117] "RemoveContainer" containerID="556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5" Feb 27 00:06:43 crc kubenswrapper[4781]: E0227 00:06:43.285266 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 00:06:44 crc kubenswrapper[4781]: I0227 00:06:44.258370 4781 csi_plugin.go:884] Failed to contact API server when waiting for 
CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:45 crc kubenswrapper[4781]: I0227 00:06:45.259701 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 00:06:46 crc kubenswrapper[4781]: I0227 00:06:46.160920 4781 csr.go:261] certificate signing request csr-dbkk9 is approved, waiting to be issued Feb 27 00:06:46 crc kubenswrapper[4781]: I0227 00:06:46.169778 4781 csr.go:257] certificate signing request csr-dbkk9 is issued Feb 27 00:06:46 crc kubenswrapper[4781]: I0227 00:06:46.252314 4781 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.127183 4781 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.171179 4781 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-07 02:32:55.977923198 +0000 UTC Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.171230 4781 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6794h26m8.80669634s for next certificate rotation Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.590669 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.592552 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.592694 4781 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.592715 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.592847 4781 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.600996 4781 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.601080 4781 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 27 00:06:47 crc kubenswrapper[4781]: E0227 00:06:47.601094 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.606414 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.606472 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.606486 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.606504 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.606519 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:06:47Z","lastTransitionTime":"2026-02-27T00:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:06:47 crc kubenswrapper[4781]: E0227 00:06:47.618135 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.625815 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.625879 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.625899 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.625926 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.625945 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:06:47Z","lastTransitionTime":"2026-02-27T00:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:06:47 crc kubenswrapper[4781]: E0227 00:06:47.637211 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.643021 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.643223 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.643259 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.643289 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.643311 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:06:47Z","lastTransitionTime":"2026-02-27T00:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:06:47 crc kubenswrapper[4781]: E0227 00:06:47.653802 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.662599 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.662650 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.662662 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.662678 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:06:47 crc kubenswrapper[4781]: I0227 00:06:47.662689 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:06:47Z","lastTransitionTime":"2026-02-27T00:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:06:47 crc kubenswrapper[4781]: E0227 00:06:47.671602 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 27 00:06:47 crc kubenswrapper[4781]: E0227 00:06:47.671843 4781 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Feb 27 00:06:47 crc kubenswrapper[4781]: E0227 00:06:47.671868 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:47 crc kubenswrapper[4781]: E0227 00:06:47.772775 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:47 crc kubenswrapper[4781]: E0227 00:06:47.873908 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:47 crc kubenswrapper[4781]: E0227 00:06:47.974374 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:48 crc kubenswrapper[4781]: E0227 00:06:48.075461 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:48 crc kubenswrapper[4781]: E0227 00:06:48.175877 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:48 crc kubenswrapper[4781]: E0227 00:06:48.276688 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:48 crc kubenswrapper[4781]: E0227 00:06:48.376978 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:48 crc kubenswrapper[4781]: E0227 00:06:48.477243 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:48 crc kubenswrapper[4781]: E0227 00:06:48.577552 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:48 crc kubenswrapper[4781]: E0227 00:06:48.677666 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:48 crc kubenswrapper[4781]: E0227 00:06:48.778388 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:48 crc kubenswrapper[4781]: E0227 00:06:48.879284 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:48 crc kubenswrapper[4781]: E0227 00:06:48.979516 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:49 crc kubenswrapper[4781]: E0227 00:06:49.079702 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:49 crc kubenswrapper[4781]: E0227 00:06:49.180758 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:49 crc kubenswrapper[4781]: E0227 00:06:49.281499 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:49 crc kubenswrapper[4781]: E0227 00:06:49.382711 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:49 crc kubenswrapper[4781]: E0227 00:06:49.483095 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:49 crc kubenswrapper[4781]: E0227 00:06:49.583506 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:49 crc kubenswrapper[4781]: E0227 00:06:49.684602 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:49 crc kubenswrapper[4781]: E0227 00:06:49.784922 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:49 crc kubenswrapper[4781]: E0227 00:06:49.886085 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:49 crc kubenswrapper[4781]: E0227 00:06:49.987090 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:50 crc kubenswrapper[4781]: E0227 00:06:50.088112 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:50 crc kubenswrapper[4781]: E0227 00:06:50.188870 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:50 crc kubenswrapper[4781]: E0227 00:06:50.289588 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:50 crc kubenswrapper[4781]: E0227 00:06:50.390543 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:50 crc kubenswrapper[4781]: E0227 00:06:50.490719 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:50 crc kubenswrapper[4781]: E0227 00:06:50.591523 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:50 crc kubenswrapper[4781]: E0227 00:06:50.692169 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:50 crc kubenswrapper[4781]: E0227 00:06:50.792767 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:50 crc kubenswrapper[4781]: I0227 00:06:50.872867 4781 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Feb 27 00:06:50 crc kubenswrapper[4781]: E0227 00:06:50.893037 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:50 crc kubenswrapper[4781]: E0227 00:06:50.993126 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:51 crc kubenswrapper[4781]: E0227 00:06:51.093320 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:51 crc kubenswrapper[4781]: E0227 00:06:51.193775 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:51 crc kubenswrapper[4781]: E0227 00:06:51.294696 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:51 crc kubenswrapper[4781]: E0227 00:06:51.388731 4781 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 27 00:06:51 crc kubenswrapper[4781]: E0227 00:06:51.394807 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:51 crc kubenswrapper[4781]: E0227 00:06:51.495673 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:51 crc kubenswrapper[4781]: E0227 00:06:51.596991 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:51 crc kubenswrapper[4781]: E0227 00:06:51.697289 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:51 crc kubenswrapper[4781]: E0227 00:06:51.800326 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:51 crc kubenswrapper[4781]: E0227 00:06:51.901416 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:52 crc kubenswrapper[4781]: E0227 00:06:52.001964 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:52 crc kubenswrapper[4781]: E0227 00:06:52.102913 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:52 crc kubenswrapper[4781]: E0227 00:06:52.203452 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:52 crc kubenswrapper[4781]: E0227 00:06:52.303578 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:52 crc kubenswrapper[4781]: E0227 00:06:52.404271 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:52 crc kubenswrapper[4781]: E0227 00:06:52.505321 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:52 crc kubenswrapper[4781]: E0227 00:06:52.606392 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:52 crc kubenswrapper[4781]: E0227 00:06:52.707341 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:52 crc kubenswrapper[4781]: E0227 00:06:52.808402 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:52 crc kubenswrapper[4781]: E0227 00:06:52.908966 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:53 crc kubenswrapper[4781]: E0227 00:06:53.010059 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:53 crc kubenswrapper[4781]: E0227 00:06:53.111165 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:53 crc kubenswrapper[4781]: E0227 00:06:53.212064 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:53 crc kubenswrapper[4781]: E0227 00:06:53.313058 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:53 crc kubenswrapper[4781]: E0227 00:06:53.414176 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:53 crc kubenswrapper[4781]: E0227 00:06:53.514531 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:53 crc kubenswrapper[4781]: E0227 00:06:53.615707 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:53 crc kubenswrapper[4781]: E0227 00:06:53.715829 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:53 crc kubenswrapper[4781]: E0227 00:06:53.816942 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:53 crc kubenswrapper[4781]: E0227 00:06:53.918314 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:54 crc kubenswrapper[4781]: E0227 00:06:54.018506 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:54 crc kubenswrapper[4781]: E0227 00:06:54.119551 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:54 crc kubenswrapper[4781]: E0227 00:06:54.219989 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:54 crc kubenswrapper[4781]: E0227 00:06:54.320673 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:54 crc kubenswrapper[4781]: E0227 00:06:54.421757 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:54 crc kubenswrapper[4781]: E0227 00:06:54.522711 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:54 crc kubenswrapper[4781]: E0227 00:06:54.622800 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:54 crc kubenswrapper[4781]: E0227 00:06:54.723930 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:54 crc kubenswrapper[4781]: E0227 00:06:54.824616 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:54 crc kubenswrapper[4781]: E0227 00:06:54.925706 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:55 crc kubenswrapper[4781]: E0227 00:06:55.026061 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:55 crc kubenswrapper[4781]: E0227 00:06:55.127017 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:55 crc kubenswrapper[4781]: E0227 00:06:55.228096 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:55 crc kubenswrapper[4781]: I0227 00:06:55.308452 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 00:06:55 crc kubenswrapper[4781]: I0227 00:06:55.309980 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:06:55 crc kubenswrapper[4781]: I0227 00:06:55.310179 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:06:55 crc kubenswrapper[4781]: I0227 00:06:55.310389 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:06:55 crc kubenswrapper[4781]: I0227 00:06:55.311511 4781 scope.go:117] "RemoveContainer" containerID="556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5"
Feb 27 00:06:55 crc kubenswrapper[4781]: E0227 00:06:55.312014 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 27 00:06:55 crc kubenswrapper[4781]: E0227 00:06:55.328290 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:55 crc kubenswrapper[4781]: E0227 00:06:55.428902 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:55 crc kubenswrapper[4781]: E0227 00:06:55.529266 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:55 crc kubenswrapper[4781]: E0227 00:06:55.629696 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:55 crc kubenswrapper[4781]: E0227 00:06:55.730567 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:55 crc kubenswrapper[4781]: E0227 00:06:55.831357 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:55 crc kubenswrapper[4781]: E0227 00:06:55.931651 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:56 crc kubenswrapper[4781]: E0227 00:06:56.032812 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:56 crc kubenswrapper[4781]: E0227 00:06:56.133433 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:56 crc kubenswrapper[4781]: E0227 00:06:56.233874 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:56 crc kubenswrapper[4781]: E0227 00:06:56.334372 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:56 crc kubenswrapper[4781]: I0227 00:06:56.415240 4781 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Feb 27 00:06:56 crc kubenswrapper[4781]: E0227 00:06:56.435184 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:56 crc kubenswrapper[4781]: E0227 00:06:56.536101 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:56 crc kubenswrapper[4781]: E0227 00:06:56.636286 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:56 crc kubenswrapper[4781]: E0227 00:06:56.736822 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:56 crc kubenswrapper[4781]: E0227 00:06:56.837586 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:56 crc kubenswrapper[4781]: E0227 00:06:56.937998 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:57 crc kubenswrapper[4781]: E0227 00:06:57.039032 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:57 crc kubenswrapper[4781]: E0227 00:06:57.139533 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:57 crc kubenswrapper[4781]: E0227 00:06:57.239915 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:57 crc kubenswrapper[4781]: E0227 00:06:57.340352 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:57 crc kubenswrapper[4781]: E0227 00:06:57.440744 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:57 crc kubenswrapper[4781]: E0227 00:06:57.541683 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:57 crc kubenswrapper[4781]: E0227 00:06:57.642506 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:57 crc kubenswrapper[4781]: E0227 00:06:57.742936 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:57 crc kubenswrapper[4781]: E0227 00:06:57.843813 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:57 crc kubenswrapper[4781]: E0227 00:06:57.944889 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 00:06:58 crc kubenswrapper[4781]: E0227 00:06:58.017927 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.023052 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.023285 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.023482 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.023708 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.023929 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:06:58Z","lastTransitionTime":"2026-02-27T00:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:06:58 crc kubenswrapper[4781]: E0227 00:06:58.038716 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.043408 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.043659 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.043811 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.043951 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.044078 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:06:58Z","lastTransitionTime":"2026-02-27T00:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:06:58 crc kubenswrapper[4781]: E0227 00:06:58.059808 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.064418 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.064484 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.064511 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.064541 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.064566 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:06:58Z","lastTransitionTime":"2026-02-27T00:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:06:58 crc kubenswrapper[4781]: E0227 00:06:58.081460 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.086613 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.086729 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.086756 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.086785 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:06:58 crc kubenswrapper[4781]: I0227 00:06:58.086808 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:06:58Z","lastTransitionTime":"2026-02-27T00:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:06:58 crc kubenswrapper[4781]: E0227 00:06:58.100193 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 00:06:58 crc kubenswrapper[4781]: E0227 00:06:58.100410 4781 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 00:06:58 crc kubenswrapper[4781]: E0227 00:06:58.100447 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:58 crc kubenswrapper[4781]: E0227 00:06:58.201186 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:58 crc kubenswrapper[4781]: E0227 00:06:58.301703 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:58 crc kubenswrapper[4781]: E0227 00:06:58.403315 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:58 crc kubenswrapper[4781]: E0227 00:06:58.504002 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:58 crc kubenswrapper[4781]: E0227 00:06:58.604987 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:58 crc kubenswrapper[4781]: E0227 00:06:58.705162 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:58 crc kubenswrapper[4781]: E0227 00:06:58.806337 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:58 crc kubenswrapper[4781]: E0227 00:06:58.907712 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:59 crc kubenswrapper[4781]: E0227 00:06:59.009068 4781 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:59 crc kubenswrapper[4781]: E0227 00:06:59.109879 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:59 crc kubenswrapper[4781]: E0227 00:06:59.210394 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:59 crc kubenswrapper[4781]: I0227 00:06:59.308770 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:06:59 crc kubenswrapper[4781]: E0227 00:06:59.312554 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:59 crc kubenswrapper[4781]: I0227 00:06:59.313763 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:06:59 crc kubenswrapper[4781]: I0227 00:06:59.313813 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:06:59 crc kubenswrapper[4781]: I0227 00:06:59.313824 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:06:59 crc kubenswrapper[4781]: E0227 00:06:59.413361 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:59 crc kubenswrapper[4781]: E0227 00:06:59.514347 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:59 crc kubenswrapper[4781]: E0227 00:06:59.615069 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:59 crc kubenswrapper[4781]: E0227 00:06:59.715946 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:59 crc 
kubenswrapper[4781]: E0227 00:06:59.817136 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:06:59 crc kubenswrapper[4781]: E0227 00:06:59.918093 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:00 crc kubenswrapper[4781]: E0227 00:07:00.019031 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:00 crc kubenswrapper[4781]: E0227 00:07:00.119595 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:00 crc kubenswrapper[4781]: E0227 00:07:00.219984 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:00 crc kubenswrapper[4781]: E0227 00:07:00.320763 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:00 crc kubenswrapper[4781]: E0227 00:07:00.421127 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:00 crc kubenswrapper[4781]: E0227 00:07:00.521221 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:00 crc kubenswrapper[4781]: E0227 00:07:00.622357 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:00 crc kubenswrapper[4781]: E0227 00:07:00.722772 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:00 crc kubenswrapper[4781]: E0227 00:07:00.823831 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:00 crc kubenswrapper[4781]: E0227 00:07:00.924962 4781 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Feb 27 00:07:01 crc kubenswrapper[4781]: E0227 00:07:01.025070 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:01 crc kubenswrapper[4781]: E0227 00:07:01.125717 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:01 crc kubenswrapper[4781]: E0227 00:07:01.225995 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:01 crc kubenswrapper[4781]: E0227 00:07:01.326923 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:01 crc kubenswrapper[4781]: E0227 00:07:01.389604 4781 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 27 00:07:01 crc kubenswrapper[4781]: E0227 00:07:01.427576 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:01 crc kubenswrapper[4781]: E0227 00:07:01.527814 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:01 crc kubenswrapper[4781]: E0227 00:07:01.627966 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:01 crc kubenswrapper[4781]: E0227 00:07:01.728540 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:01 crc kubenswrapper[4781]: E0227 00:07:01.829706 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:01 crc kubenswrapper[4781]: E0227 00:07:01.930741 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:02 crc kubenswrapper[4781]: E0227 00:07:02.031100 4781 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:02 crc kubenswrapper[4781]: E0227 00:07:02.132093 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:02 crc kubenswrapper[4781]: E0227 00:07:02.233202 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:02 crc kubenswrapper[4781]: E0227 00:07:02.333538 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:02 crc kubenswrapper[4781]: E0227 00:07:02.433930 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:02 crc kubenswrapper[4781]: E0227 00:07:02.534864 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:02 crc kubenswrapper[4781]: E0227 00:07:02.635774 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:02 crc kubenswrapper[4781]: E0227 00:07:02.736253 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:02 crc kubenswrapper[4781]: E0227 00:07:02.836575 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:02 crc kubenswrapper[4781]: E0227 00:07:02.937132 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:03 crc kubenswrapper[4781]: E0227 00:07:03.038246 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:03 crc kubenswrapper[4781]: E0227 00:07:03.138347 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:03 crc 
kubenswrapper[4781]: E0227 00:07:03.239502 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:03 crc kubenswrapper[4781]: E0227 00:07:03.340041 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:03 crc kubenswrapper[4781]: E0227 00:07:03.441025 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:03 crc kubenswrapper[4781]: E0227 00:07:03.541158 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:03 crc kubenswrapper[4781]: E0227 00:07:03.642011 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:03 crc kubenswrapper[4781]: E0227 00:07:03.742113 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:03 crc kubenswrapper[4781]: I0227 00:07:03.750489 4781 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 27 00:07:03 crc kubenswrapper[4781]: E0227 00:07:03.842721 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:03 crc kubenswrapper[4781]: E0227 00:07:03.943805 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:04 crc kubenswrapper[4781]: E0227 00:07:04.044091 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:04 crc kubenswrapper[4781]: E0227 00:07:04.145264 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:04 crc kubenswrapper[4781]: E0227 00:07:04.245816 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"crc\" not found" Feb 27 00:07:04 crc kubenswrapper[4781]: E0227 00:07:04.347040 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:04 crc kubenswrapper[4781]: E0227 00:07:04.448100 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:04 crc kubenswrapper[4781]: E0227 00:07:04.549014 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:04 crc kubenswrapper[4781]: E0227 00:07:04.649924 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:04 crc kubenswrapper[4781]: E0227 00:07:04.750299 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:04 crc kubenswrapper[4781]: E0227 00:07:04.851381 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:04 crc kubenswrapper[4781]: E0227 00:07:04.952342 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:05 crc kubenswrapper[4781]: E0227 00:07:05.052819 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:05 crc kubenswrapper[4781]: E0227 00:07:05.153297 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:05 crc kubenswrapper[4781]: E0227 00:07:05.253789 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:05 crc kubenswrapper[4781]: E0227 00:07:05.354670 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:05 crc kubenswrapper[4781]: E0227 00:07:05.454957 4781 kubelet_node_status.go:503] "Error getting 
the current node from lister" err="node \"crc\" not found" Feb 27 00:07:05 crc kubenswrapper[4781]: E0227 00:07:05.555601 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:05 crc kubenswrapper[4781]: E0227 00:07:05.655916 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:05 crc kubenswrapper[4781]: E0227 00:07:05.756041 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:05 crc kubenswrapper[4781]: E0227 00:07:05.857124 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:05 crc kubenswrapper[4781]: E0227 00:07:05.957829 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:06 crc kubenswrapper[4781]: E0227 00:07:06.059009 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:06 crc kubenswrapper[4781]: E0227 00:07:06.159878 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:06 crc kubenswrapper[4781]: E0227 00:07:06.260394 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:06 crc kubenswrapper[4781]: E0227 00:07:06.361231 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:06 crc kubenswrapper[4781]: E0227 00:07:06.461786 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:06 crc kubenswrapper[4781]: E0227 00:07:06.561970 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:06 crc kubenswrapper[4781]: E0227 00:07:06.662672 4781 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:06 crc kubenswrapper[4781]: E0227 00:07:06.763514 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:06 crc kubenswrapper[4781]: E0227 00:07:06.864416 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:06 crc kubenswrapper[4781]: E0227 00:07:06.964679 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:07 crc kubenswrapper[4781]: E0227 00:07:07.065389 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:07 crc kubenswrapper[4781]: E0227 00:07:07.165689 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:07 crc kubenswrapper[4781]: E0227 00:07:07.266606 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:07 crc kubenswrapper[4781]: E0227 00:07:07.367580 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:07 crc kubenswrapper[4781]: E0227 00:07:07.467903 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:07 crc kubenswrapper[4781]: E0227 00:07:07.568123 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:07 crc kubenswrapper[4781]: E0227 00:07:07.669230 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:07 crc kubenswrapper[4781]: E0227 00:07:07.770307 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:07 crc 
kubenswrapper[4781]: E0227 00:07:07.871132 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:07 crc kubenswrapper[4781]: E0227 00:07:07.971960 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:08 crc kubenswrapper[4781]: E0227 00:07:08.072586 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:08 crc kubenswrapper[4781]: E0227 00:07:08.172883 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:08 crc kubenswrapper[4781]: E0227 00:07:08.273430 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:08 crc kubenswrapper[4781]: E0227 00:07:08.373839 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:08 crc kubenswrapper[4781]: E0227 00:07:08.447433 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.452252 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.452315 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.452327 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.452343 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.452354 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:08Z","lastTransitionTime":"2026-02-27T00:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:08 crc kubenswrapper[4781]: E0227 00:07:08.463225 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.467362 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.467400 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.467416 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.467432 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.467443 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:08Z","lastTransitionTime":"2026-02-27T00:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:08 crc kubenswrapper[4781]: E0227 00:07:08.481857 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.486288 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.486344 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.486368 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.486396 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.486417 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:08Z","lastTransitionTime":"2026-02-27T00:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:08 crc kubenswrapper[4781]: E0227 00:07:08.506369 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.511733 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.511804 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.511818 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.511836 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:08 crc kubenswrapper[4781]: I0227 00:07:08.512168 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:08Z","lastTransitionTime":"2026-02-27T00:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:08 crc kubenswrapper[4781]: E0227 00:07:08.527291 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 00:07:08 crc kubenswrapper[4781]: E0227 00:07:08.527519 4781 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 00:07:08 crc kubenswrapper[4781]: E0227 00:07:08.527574 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:08 crc kubenswrapper[4781]: E0227 00:07:08.627977 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:08 crc kubenswrapper[4781]: E0227 00:07:08.728821 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:08 crc kubenswrapper[4781]: E0227 00:07:08.829864 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:08 crc kubenswrapper[4781]: E0227 00:07:08.930388 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:09 crc kubenswrapper[4781]: E0227 00:07:09.030559 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:09 crc kubenswrapper[4781]: E0227 00:07:09.131257 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:09 crc kubenswrapper[4781]: E0227 00:07:09.231408 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:09 crc kubenswrapper[4781]: I0227 00:07:09.309197 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 00:07:09 crc kubenswrapper[4781]: I0227 00:07:09.310575 4781 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:09 crc kubenswrapper[4781]: I0227 00:07:09.310665 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:09 crc kubenswrapper[4781]: I0227 00:07:09.310685 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:09 crc kubenswrapper[4781]: I0227 00:07:09.311658 4781 scope.go:117] "RemoveContainer" containerID="556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5" Feb 27 00:07:09 crc kubenswrapper[4781]: E0227 00:07:09.311940 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 00:07:09 crc kubenswrapper[4781]: E0227 00:07:09.332448 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:09 crc kubenswrapper[4781]: E0227 00:07:09.433131 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:09 crc kubenswrapper[4781]: E0227 00:07:09.534229 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:09 crc kubenswrapper[4781]: E0227 00:07:09.635314 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:09 crc kubenswrapper[4781]: E0227 00:07:09.735653 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:09 crc kubenswrapper[4781]: E0227 
00:07:09.836604 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 00:07:09 crc kubenswrapper[4781]: I0227 00:07:09.842778 4781 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 27 00:07:09 crc kubenswrapper[4781]: I0227 00:07:09.939149 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:09 crc kubenswrapper[4781]: I0227 00:07:09.939200 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:09 crc kubenswrapper[4781]: I0227 00:07:09.939264 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:09 crc kubenswrapper[4781]: I0227 00:07:09.939287 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:09 crc kubenswrapper[4781]: I0227 00:07:09.939308 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:09Z","lastTransitionTime":"2026-02-27T00:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.042884 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.043143 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.043250 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.043351 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.043512 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:10Z","lastTransitionTime":"2026-02-27T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.145816 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.145881 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.145898 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.145922 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.145938 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:10Z","lastTransitionTime":"2026-02-27T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.249535 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.249597 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.249614 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.249663 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.249681 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:10Z","lastTransitionTime":"2026-02-27T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.310830 4781 apiserver.go:52] "Watching apiserver" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.316135 4781 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.316511 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.317152 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.317311 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.317347 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.317523 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.317600 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.317677 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.320701 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.321431 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.321566 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.322432 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.322803 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.323280 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.323442 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.323763 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.323773 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.323908 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.324122 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.324780 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.351894 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:10 crc 
kubenswrapper[4781]: I0227 00:07:10.351939 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.351956 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.351979 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.351999 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:10Z","lastTransitionTime":"2026-02-27T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.355028 4781 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.361918 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.376861 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.392071 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406245 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406303 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406340 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406372 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406404 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406434 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406469 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: 
\"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406499 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406529 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406561 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406592 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406622 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406682 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406714 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406744 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406746 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406776 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406880 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406922 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406909 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.406958 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407105 4781 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407200 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407194 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407283 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407317 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407358 4781 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407402 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407447 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407489 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407528 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407559 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: 
\"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407590 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407661 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407733 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407776 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407813 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407848 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407882 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407916 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407948 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407982 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.408012 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.408047 4781 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.408081 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.408113 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.408179 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.408218 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409281 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409333 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409369 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409450 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409482 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409514 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409546 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409700 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409739 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409771 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409805 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409836 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: 
\"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409871 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409904 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409938 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409970 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410006 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410080 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410116 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410150 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410185 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407238 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410218 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407586 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.407797 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.408046 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410257 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410292 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410329 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410359 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410391 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410425 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410457 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410488 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410524 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410562 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410615 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: 
\"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410705 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410754 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410806 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410849 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410885 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410919 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410954 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410991 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411023 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411054 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411086 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: 
\"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411118 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411190 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411226 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411292 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411325 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411358 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411394 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411425 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411460 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411498 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411531 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 27 00:07:10 
crc kubenswrapper[4781]: I0227 00:07:10.411564 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411598 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411723 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411775 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411960 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412014 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412058 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412108 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412154 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412203 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412256 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412302 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412350 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412397 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412450 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412488 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412523 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412558 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412590 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412625 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412710 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412748 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 00:07:10 
crc kubenswrapper[4781]: I0227 00:07:10.412783 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412817 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412856 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412889 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412920 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412955 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412996 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413140 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413176 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413207 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413240 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 
00:07:10.413275 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413312 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413345 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413379 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413413 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413445 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413478 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413546 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413582 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413618 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413709 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: 
I0227 00:07:10.413747 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413781 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413817 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413852 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413886 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413923 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413957 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413991 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414027 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414063 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414097 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 
00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414132 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414164 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414198 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414237 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414276 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414312 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: 
\"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414348 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414383 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414418 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414452 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414486 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 27 00:07:10 crc 
kubenswrapper[4781]: I0227 00:07:10.414521 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414556 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414595 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414677 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414729 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414779 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414825 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414874 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414915 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414952 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415005 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 
27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415041 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415076 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415113 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415146 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415182 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415218 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415264 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415300 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415336 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415373 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415410 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 27 00:07:10 crc 
kubenswrapper[4781]: I0227 00:07:10.415447 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415482 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415517 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415553 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415663 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415734 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415771 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415804 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.420016 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.420101 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.420168 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod 
\"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.420225 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.420278 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.420323 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.420378 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.420425 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.420492 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.408270 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.408449 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.425925 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). 
InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.426454 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.426611 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.426684 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.427015 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.427129 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.427300 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.427417 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.427464 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.427545 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.427696 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.428465 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.429111 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.422006 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.429850 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.408495 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.408990 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409188 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409409 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409850 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409854 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.409874 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410194 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410674 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.410878 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411074 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411414 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.411465 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412670 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.437455 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.412727 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413117 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413263 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.413700 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414242 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.414578 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415218 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415315 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.437709 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415445 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415442 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.415732 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.420675 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.420696 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.420735 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.420774 4781 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.420918 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.421061 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.421215 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.421311 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.421376 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.422072 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.422197 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.422323 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.423408 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.423597 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.423685 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.423795 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.423788 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.424620 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.424715 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.424941 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.424938 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.425202 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.425303 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.425756 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.425857 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.430189 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.430261 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.430275 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.430507 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.430943 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.430999 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.431137 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.431653 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.431797 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.431836 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.431518 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.431935 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.432250 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.408869 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.432396 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.432408 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.433297 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.433301 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.433433 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.433464 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.433505 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.433634 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.434005 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.434514 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.434663 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.434903 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.435599 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.436124 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.436903 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.437087 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.437102 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.437199 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.438587 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.438758 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.438805 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.439345 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.439425 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.439328 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.440483 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.439598 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.440881 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.442517 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.443248 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.444508 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.441704 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.444797 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.446081 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.446339 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.446369 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.446376 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.446597 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.448280 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.446307 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.447230 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.447292 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.447583 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.447657 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.447700 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.448259 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.448774 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.448132 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.448602 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.448599 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.448889 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.449041 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.449042 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.449212 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.449429 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.449816 4781 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.449821 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.450970 4781 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.451399 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.452061 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.452463 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:07:10.952413764 +0000 UTC m=+100.209953368 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.452542 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:10.952520277 +0000 UTC m=+100.210059871 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.452712 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.452877 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: 
"22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453027 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.453145 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:10.953101102 +0000 UTC m=+100.210640736 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453223 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453292 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" 
(UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453353 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453412 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453585 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453620 4781 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453693 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453726 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: 
\"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453755 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453782 4781 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453808 4781 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453830 4781 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453850 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453868 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453886 4781 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453905 4781 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453924 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453943 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453961 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453979 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.453998 4781 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454015 4781 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454034 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454051 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454070 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454089 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454107 4781 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454126 4781 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454145 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" 
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454163 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454180 4781 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454197 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454215 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454233 4781 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454251 4781 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454269 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454287 4781 
reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454307 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454745 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454998 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.457231 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.458776 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.458823 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.458847 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.458879 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.458902 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:10Z","lastTransitionTime":"2026-02-27T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.462307 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.464542 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.468004 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.468901 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.469379 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.472385 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.472508 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.472595 4781 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.472759 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:10.97272514 +0000 UTC m=+100.230264774 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.472546 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.473264 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.454325 4781 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.473804 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.473892 4781 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.473975 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.474050 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.474132 4781 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.474209 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: 
\"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.474293 4781 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.474366 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.474442 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.474515 4781 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.474593 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.474688 4781 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.474779 4781 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 27 
00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.474880 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.474958 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.475038 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.475113 4781 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.475218 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.475309 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.475389 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.475469 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" 
(UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.475590 4781 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.475694 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.475812 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.475906 4781 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.475999 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.476084 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.476168 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" 
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.476245 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.476345 4781 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.476436 4781 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.476511 4781 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.476582 4781 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.476683 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.476771 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.476856 4781 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.477364 4781 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.477481 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.477571 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.477682 4781 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.477784 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.477951 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.478149 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.478272 4781 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.478426 4781 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.478530 4781 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.478669 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.478775 4781 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.478866 4781 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.478943 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.479018 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.479107 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.479199 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.479309 4781 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.479425 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.479504 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.479613 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.479742 4781 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.479929 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.480027 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.480237 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.480348 4781 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.480434 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.480517 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.480601 4781 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.480704 4781 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.480905 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.481580 4781 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.481740 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.481854 4781 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.481960 4781 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.482069 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.482158 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.482245 4781 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.482337 4781 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.482412 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.482490 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.482575 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.482681 4781 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.482773 4781 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.482865 4781 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.482958 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.483044 4781 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.483124 4781 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.483210 4781 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.483295 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.483383 4781 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.483467 4781 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.483548 4781 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.483696 4781 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.481903 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.483808 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.483971 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484012 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484033 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484057 4781 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484076 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484095 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484114 4781 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484131 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484148 4781 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\""
Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484166 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\""
Feb 27
00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484183 4781 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484201 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484219 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484236 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484253 4781 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484270 4781 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.473811 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.483411 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484288 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484427 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.484362 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.484492 4781 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484161 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.484223 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.484613 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:10.984574509 +0000 UTC m=+100.242114173 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.485869 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.486156 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.486337 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.486457 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.486484 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.486817 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.487097 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.486618 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.487251 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.487519 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.488214 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.488518 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.488583 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.488683 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.488750 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.490243 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.491575 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.492697 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.498610 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.499584 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.499700 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.499946 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.499944 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.500009 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.499591 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.501108 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.501304 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.501380 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.502030 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.502399 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.506744 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.506821 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.506947 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.506983 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.507212 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.507490 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.507855 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.508083 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.514406 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.525574 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.527834 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.528135 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.544938 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.562478 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.562513 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.562522 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.562538 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.562548 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:10Z","lastTransitionTime":"2026-02-27T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.585475 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.585571 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.585605 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.585713 4781 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.585839 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.585956 4781 reconciler_common.go:293] "Volume detached for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.585997 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586023 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586052 4781 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586065 4781 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586077 4781 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586090 4781 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586102 4781 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586113 4781 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586125 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586136 4781 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586150 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586162 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586174 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586185 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586196 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586207 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586218 4781 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586231 4781 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586244 4781 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586256 4781 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586268 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586279 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586291 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586303 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586315 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586327 4781 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586337 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586348 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") 
on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586359 4781 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586370 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586382 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586393 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586405 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586417 4781 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586429 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc 
kubenswrapper[4781]: I0227 00:07:10.586440 4781 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586451 4781 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586462 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586473 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586485 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586496 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586507 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586518 4781 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586529 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586539 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586550 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586561 4781 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.586572 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.644789 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.657773 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.665167 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.665297 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.665387 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.665493 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.665572 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:10Z","lastTransitionTime":"2026-02-27T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.669426 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.674173 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"d60cfc6263e7a31b73237d590cab586ca1c5bb0bd1ff189f6a9548d2a24062bc"} Feb 27 00:07:10 crc kubenswrapper[4781]: W0227 00:07:10.682495 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-b95b44680a9688a92cefef6124c9b7eec60a5565b90b37d2756147ffe3822e5a WatchSource:0}: Error finding container b95b44680a9688a92cefef6124c9b7eec60a5565b90b37d2756147ffe3822e5a: Status 404 returned error can't find the container with id b95b44680a9688a92cefef6124c9b7eec60a5565b90b37d2756147ffe3822e5a Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.768775 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.769070 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.769259 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.769507 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.769735 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:10Z","lastTransitionTime":"2026-02-27T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.871647 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.871776 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.871848 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.871913 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.871972 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:10Z","lastTransitionTime":"2026-02-27T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.974508 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.974560 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.974600 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.974619 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.974650 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:10Z","lastTransitionTime":"2026-02-27T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.989981 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.990083 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.990125 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.990188 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:07:11.990154026 +0000 UTC m=+101.247693620 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.990261 4781 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.990281 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.990320 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.990333 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:11.99031269 +0000 UTC m=+101.247852324 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.990340 4781 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.990279 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.990399 4781 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.990410 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:11.990387142 +0000 UTC m=+101.247926736 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:10 crc kubenswrapper[4781]: I0227 00:07:10.990547 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.990602 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:11.990580737 +0000 UTC m=+101.248120291 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.990622 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.991691 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.991740 4781 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:10 crc kubenswrapper[4781]: E0227 00:07:10.991796 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:11.99178425 +0000 UTC m=+101.249323804 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.078357 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.078416 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.078427 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.078444 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.078457 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:11Z","lastTransitionTime":"2026-02-27T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.180722 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.180786 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.180799 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.180816 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.180828 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:11Z","lastTransitionTime":"2026-02-27T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.282865 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.282916 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.282932 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.282961 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.282976 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:11Z","lastTransitionTime":"2026-02-27T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.313412 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.313917 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.315078 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.315667 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.316594 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.317068 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.317641 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.318719 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.319428 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.320366 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.321165 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.322205 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.322674 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.323185 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.324027 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.324521 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.325453 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.325889 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.326443 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.326682 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.327418 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.327986 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.328952 4781 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.329349 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.330326 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.330726 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.331319 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.332325 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.332843 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.333887 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.334353 4781 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.335146 4781 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.335243 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.336774 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.337670 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.338127 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.339678 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.340258 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 27 00:07:11 
crc kubenswrapper[4781]: I0227 00:07:11.341132 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.341430 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.341778 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.342778 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.343241 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.344273 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.344944 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.345900 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.346336 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.347205 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.347802 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.348859 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.349340 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.350211 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.350722 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.351692 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.352243 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.352717 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.361137 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.385831 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 
27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.385872 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.385884 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.385900 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.385911 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:11Z","lastTransitionTime":"2026-02-27T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.410015 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.429856 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.440805 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.487754 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.487849 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.487862 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.487882 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.487894 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:11Z","lastTransitionTime":"2026-02-27T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.589835 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.589879 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.589891 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.589906 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.589920 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:11Z","lastTransitionTime":"2026-02-27T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.679778 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995"} Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.679851 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0"} Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.679872 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b95b44680a9688a92cefef6124c9b7eec60a5565b90b37d2756147ffe3822e5a"} Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.681885 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d"} Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.683170 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"58a2dc55f42c0314911126d5f58434cc542b34d5467d8896fa32c78ba8af47e7"} Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.692093 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.692149 4781 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.692172 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.692200 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.692221 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:11Z","lastTransitionTime":"2026-02-27T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.699928 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.718943 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.736543 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.756755 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.773830 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.792996 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.795564 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.795622 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 
00:07:11.795672 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.795697 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.795715 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:11Z","lastTransitionTime":"2026-02-27T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.815296 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.834338 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.850649 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.863415 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.879197 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.891704 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.898342 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.898395 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.898412 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.898431 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.898443 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:11Z","lastTransitionTime":"2026-02-27T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.997497 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.997562 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.997583 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.997602 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:11 crc kubenswrapper[4781]: I0227 00:07:11.997619 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:11 crc kubenswrapper[4781]: E0227 00:07:11.997786 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:07:13.997768913 +0000 UTC m=+103.255308457 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:07:11 crc kubenswrapper[4781]: E0227 00:07:11.997807 4781 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 00:07:11 crc kubenswrapper[4781]: E0227 00:07:11.997791 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 00:07:11 crc kubenswrapper[4781]: E0227 00:07:11.997859 4781 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:13.997839165 +0000 UTC m=+103.255378719 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 00:07:11 crc kubenswrapper[4781]: E0227 00:07:11.997859 4781 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 00:07:11 crc kubenswrapper[4781]: E0227 00:07:11.997911 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 00:07:11 crc kubenswrapper[4781]: E0227 00:07:11.997954 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 00:07:11 crc kubenswrapper[4781]: E0227 00:07:11.997970 4781 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:11 crc kubenswrapper[4781]: E0227 00:07:11.997977 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-27 00:07:13.997947168 +0000 UTC m=+103.255486752 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 00:07:11 crc kubenswrapper[4781]: E0227 00:07:11.997870 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 00:07:11 crc kubenswrapper[4781]: E0227 00:07:11.998036 4781 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:11 crc kubenswrapper[4781]: E0227 00:07:11.998035 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:13.99801208 +0000 UTC m=+103.255551634 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:11 crc kubenswrapper[4781]: E0227 00:07:11.998100 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:13.998091802 +0000 UTC m=+103.255631356 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.001315 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.001376 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.001397 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.001440 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.001464 4781 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:12Z","lastTransitionTime":"2026-02-27T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.103576 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.103691 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.103715 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.103756 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.103781 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:12Z","lastTransitionTime":"2026-02-27T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.159644 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-d2xt9"] Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.160101 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-d2xt9" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.163080 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.163859 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.166910 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.191967 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.199588 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4365da31-2d17-4b58-bb27-bd47b5133a8c-hosts-file\") pod \"node-resolver-d2xt9\" (UID: \"4365da31-2d17-4b58-bb27-bd47b5133a8c\") " pod="openshift-dns/node-resolver-d2xt9" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.199850 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dhwr\" (UniqueName: \"kubernetes.io/projected/4365da31-2d17-4b58-bb27-bd47b5133a8c-kube-api-access-7dhwr\") pod \"node-resolver-d2xt9\" (UID: \"4365da31-2d17-4b58-bb27-bd47b5133a8c\") " pod="openshift-dns/node-resolver-d2xt9" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.206430 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:12 crc 
kubenswrapper[4781]: I0227 00:07:12.206490 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.206511 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.206539 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.206557 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:12Z","lastTransitionTime":"2026-02-27T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.217528 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.235940 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.249472 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.262496 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.275302 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.286563 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.300916 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dhwr\" (UniqueName: \"kubernetes.io/projected/4365da31-2d17-4b58-bb27-bd47b5133a8c-kube-api-access-7dhwr\") pod \"node-resolver-d2xt9\" (UID: \"4365da31-2d17-4b58-bb27-bd47b5133a8c\") " pod="openshift-dns/node-resolver-d2xt9" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.300987 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/4365da31-2d17-4b58-bb27-bd47b5133a8c-hosts-file\") pod \"node-resolver-d2xt9\" (UID: \"4365da31-2d17-4b58-bb27-bd47b5133a8c\") " pod="openshift-dns/node-resolver-d2xt9" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.301088 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4365da31-2d17-4b58-bb27-bd47b5133a8c-hosts-file\") pod \"node-resolver-d2xt9\" (UID: \"4365da31-2d17-4b58-bb27-bd47b5133a8c\") " pod="openshift-dns/node-resolver-d2xt9" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.308349 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.308376 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.308401 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:12 crc kubenswrapper[4781]: E0227 00:07:12.308492 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:12 crc kubenswrapper[4781]: E0227 00:07:12.308588 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:12 crc kubenswrapper[4781]: E0227 00:07:12.308714 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.308996 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.309041 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.309059 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.309078 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.309098 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:12Z","lastTransitionTime":"2026-02-27T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.333221 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dhwr\" (UniqueName: \"kubernetes.io/projected/4365da31-2d17-4b58-bb27-bd47b5133a8c-kube-api-access-7dhwr\") pod \"node-resolver-d2xt9\" (UID: \"4365da31-2d17-4b58-bb27-bd47b5133a8c\") " pod="openshift-dns/node-resolver-d2xt9" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.411284 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.411339 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.411351 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.411370 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.411386 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:12Z","lastTransitionTime":"2026-02-27T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.486124 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-d2xt9" Feb 27 00:07:12 crc kubenswrapper[4781]: W0227 00:07:12.502756 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4365da31_2d17_4b58_bb27_bd47b5133a8c.slice/crio-7c906e9e1f1d5d4ed385101cdf2c29ad382c8221dd02b3613e2b41466b7b10c8 WatchSource:0}: Error finding container 7c906e9e1f1d5d4ed385101cdf2c29ad382c8221dd02b3613e2b41466b7b10c8: Status 404 returned error can't find the container with id 7c906e9e1f1d5d4ed385101cdf2c29ad382c8221dd02b3613e2b41466b7b10c8 Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.512909 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.512961 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.512970 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.512989 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.513006 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:12Z","lastTransitionTime":"2026-02-27T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.550402 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-2k4zf"] Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.551011 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-tlstj"] Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.551249 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.553929 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.558180 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-v6fnj"] Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.559869 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.559988 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.560103 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.560245 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.560276 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.560303 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.560433 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.562914 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.563333 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.563934 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.564197 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.564362 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.564466 4781 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.575870 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.587885 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.597540 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.608875 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49nxx\" (UniqueName: \"kubernetes.io/projected/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-kube-api-access-49nxx\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.608927 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/32c19e2e-0830-47a5-9ea8-862e1c9d8571-proxy-tls\") pod \"machine-config-daemon-v6fnj\" (UID: \"32c19e2e-0830-47a5-9ea8-862e1c9d8571\") " pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.608954 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-host-var-lib-cni-bin\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.608998 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-cnibin\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.609069 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2f348e07-ea87-45b6-8f2b-6e1b08eda780-system-cni-dir\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.609150 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-host-run-multus-certs\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.609200 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2f348e07-ea87-45b6-8f2b-6e1b08eda780-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.609221 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/32c19e2e-0830-47a5-9ea8-862e1c9d8571-rootfs\") pod \"machine-config-daemon-v6fnj\" (UID: \"32c19e2e-0830-47a5-9ea8-862e1c9d8571\") " pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.609301 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-system-cni-dir\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.609409 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-os-release\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.609517 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-multus-daemon-config\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.609561 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2f348e07-ea87-45b6-8f2b-6e1b08eda780-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.609603 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj8st\" (UniqueName: \"kubernetes.io/projected/32c19e2e-0830-47a5-9ea8-862e1c9d8571-kube-api-access-qj8st\") pod \"machine-config-daemon-v6fnj\" (UID: \"32c19e2e-0830-47a5-9ea8-862e1c9d8571\") " pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.609690 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2f348e07-ea87-45b6-8f2b-6e1b08eda780-cnibin\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.609724 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2f348e07-ea87-45b6-8f2b-6e1b08eda780-cni-binary-copy\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.609752 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-cni-binary-copy\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 
00:07:12.609781 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-host-run-netns\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.609811 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x98vj\" (UniqueName: \"kubernetes.io/projected/2f348e07-ea87-45b6-8f2b-6e1b08eda780-kube-api-access-x98vj\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.609842 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-hostroot\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.609899 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-multus-conf-dir\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.609933 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-multus-socket-dir-parent\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 
00:07:12.609965 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-host-var-lib-kubelet\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.609993 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-etc-kubernetes\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.610026 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-multus-cni-dir\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.610058 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-host-run-k8s-cni-cncf-io\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.610084 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/32c19e2e-0830-47a5-9ea8-862e1c9d8571-mcd-auth-proxy-config\") pod \"machine-config-daemon-v6fnj\" (UID: \"32c19e2e-0830-47a5-9ea8-862e1c9d8571\") " pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:07:12 crc 
kubenswrapper[4781]: I0227 00:07:12.610128 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-host-var-lib-cni-multus\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.610157 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2f348e07-ea87-45b6-8f2b-6e1b08eda780-os-release\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.613950 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.615797 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.615842 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.615856 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.615874 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.615886 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:12Z","lastTransitionTime":"2026-02-27T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.632362 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.648260 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.665563 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.676989 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.687779 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-d2xt9" 
event={"ID":"4365da31-2d17-4b58-bb27-bd47b5133a8c","Type":"ContainerStarted","Data":"7c906e9e1f1d5d4ed385101cdf2c29ad382c8221dd02b3613e2b41466b7b10c8"} Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.688556 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.700063 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711056 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-host-var-lib-cni-bin\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711108 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-cnibin\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711136 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2f348e07-ea87-45b6-8f2b-6e1b08eda780-system-cni-dir\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: 
\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711158 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-host-run-multus-certs\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711179 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2f348e07-ea87-45b6-8f2b-6e1b08eda780-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711201 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/32c19e2e-0830-47a5-9ea8-862e1c9d8571-rootfs\") pod \"machine-config-daemon-v6fnj\" (UID: \"32c19e2e-0830-47a5-9ea8-862e1c9d8571\") " pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711228 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-host-var-lib-cni-bin\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711242 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-system-cni-dir\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " 
pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711299 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-system-cni-dir\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711330 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-os-release\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711347 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-cnibin\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711370 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2f348e07-ea87-45b6-8f2b-6e1b08eda780-system-cni-dir\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711372 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-multus-daemon-config\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711391 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-host-run-multus-certs\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711422 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2f348e07-ea87-45b6-8f2b-6e1b08eda780-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711456 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj8st\" (UniqueName: \"kubernetes.io/projected/32c19e2e-0830-47a5-9ea8-862e1c9d8571-kube-api-access-qj8st\") pod \"machine-config-daemon-v6fnj\" (UID: \"32c19e2e-0830-47a5-9ea8-862e1c9d8571\") " pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711507 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2f348e07-ea87-45b6-8f2b-6e1b08eda780-cnibin\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711538 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2f348e07-ea87-45b6-8f2b-6e1b08eda780-cni-binary-copy\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711571 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-cni-binary-copy\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711598 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-host-run-netns\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711652 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x98vj\" (UniqueName: \"kubernetes.io/projected/2f348e07-ea87-45b6-8f2b-6e1b08eda780-kube-api-access-x98vj\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711681 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-hostroot\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711715 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-multus-conf-dir\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711745 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-multus-socket-dir-parent\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711775 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-host-var-lib-kubelet\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711803 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-etc-kubernetes\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711837 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-multus-cni-dir\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711866 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-host-run-k8s-cni-cncf-io\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711896 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/32c19e2e-0830-47a5-9ea8-862e1c9d8571-mcd-auth-proxy-config\") pod \"machine-config-daemon-v6fnj\" (UID: 
\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\") " pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711947 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-host-var-lib-cni-multus\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711976 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2f348e07-ea87-45b6-8f2b-6e1b08eda780-os-release\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.712006 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49nxx\" (UniqueName: \"kubernetes.io/projected/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-kube-api-access-49nxx\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.712036 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/32c19e2e-0830-47a5-9ea8-862e1c9d8571-proxy-tls\") pod \"machine-config-daemon-v6fnj\" (UID: \"32c19e2e-0830-47a5-9ea8-862e1c9d8571\") " pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.712049 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2f348e07-ea87-45b6-8f2b-6e1b08eda780-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: 
\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.712077 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/32c19e2e-0830-47a5-9ea8-862e1c9d8571-rootfs\") pod \"machine-config-daemon-v6fnj\" (UID: \"32c19e2e-0830-47a5-9ea8-862e1c9d8571\") " pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.711461 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-os-release\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.712552 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-multus-daemon-config\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.712669 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-host-run-k8s-cni-cncf-io\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.712673 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-host-var-lib-kubelet\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 
00:07:12.712737 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-multus-socket-dir-parent\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.712740 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-etc-kubernetes\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.712776 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-host-run-netns\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.712901 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-multus-cni-dir\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.712965 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2f348e07-ea87-45b6-8f2b-6e1b08eda780-cni-binary-copy\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.712993 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/2f348e07-ea87-45b6-8f2b-6e1b08eda780-cnibin\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.713016 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-host-var-lib-cni-multus\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.713065 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-hostroot\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.713118 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-multus-conf-dir\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.713154 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2f348e07-ea87-45b6-8f2b-6e1b08eda780-os-release\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.713686 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/32c19e2e-0830-47a5-9ea8-862e1c9d8571-mcd-auth-proxy-config\") pod \"machine-config-daemon-v6fnj\" 
(UID: \"32c19e2e-0830-47a5-9ea8-862e1c9d8571\") " pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.713856 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-cni-binary-copy\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.713877 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2f348e07-ea87-45b6-8f2b-6e1b08eda780-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.716265 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/32c19e2e-0830-47a5-9ea8-862e1c9d8571-proxy-tls\") pod \"machine-config-daemon-v6fnj\" (UID: \"32c19e2e-0830-47a5-9ea8-862e1c9d8571\") " pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.724162 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.725393 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.725488 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.725509 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.725532 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.725550 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:12Z","lastTransitionTime":"2026-02-27T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.735105 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x98vj\" (UniqueName: \"kubernetes.io/projected/2f348e07-ea87-45b6-8f2b-6e1b08eda780-kube-api-access-x98vj\") pod \"multus-additional-cni-plugins-2k4zf\" (UID: \"2f348e07-ea87-45b6-8f2b-6e1b08eda780\") " pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.737854 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49nxx\" (UniqueName: \"kubernetes.io/projected/9a6dd1e0-45ab-46f0-b298-d89e47aaeecb-kube-api-access-49nxx\") pod \"multus-tlstj\" (UID: \"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\") " pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.740421 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.743832 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj8st\" (UniqueName: \"kubernetes.io/projected/32c19e2e-0830-47a5-9ea8-862e1c9d8571-kube-api-access-qj8st\") pod \"machine-config-daemon-v6fnj\" (UID: \"32c19e2e-0830-47a5-9ea8-862e1c9d8571\") " pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.759064 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with 
incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\
"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mount
Path\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.773648 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.789864 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.808937 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.820795 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.827998 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.828037 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.828048 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.828063 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.828072 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:12Z","lastTransitionTime":"2026-02-27T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.829158 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.841931 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.874833 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-tlstj" Feb 27 00:07:12 crc kubenswrapper[4781]: W0227 00:07:12.885957 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a6dd1e0_45ab_46f0_b298_d89e47aaeecb.slice/crio-afca0776f8332a8c4a92e2364a60b965769223e7fdd2984f8a337a5359abdfae WatchSource:0}: Error finding container afca0776f8332a8c4a92e2364a60b965769223e7fdd2984f8a337a5359abdfae: Status 404 returned error can't find the container with id afca0776f8332a8c4a92e2364a60b965769223e7fdd2984f8a337a5359abdfae Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.886772 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.894447 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:07:12 crc kubenswrapper[4781]: W0227 00:07:12.894816 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f348e07_ea87_45b6_8f2b_6e1b08eda780.slice/crio-126f52a357f307cfd5409436394d46cd4bf8f894039fb1caf98e2be20ed3ac05 WatchSource:0}: Error finding container 126f52a357f307cfd5409436394d46cd4bf8f894039fb1caf98e2be20ed3ac05: Status 404 returned error can't find the container with id 126f52a357f307cfd5409436394d46cd4bf8f894039fb1caf98e2be20ed3ac05 Feb 27 00:07:12 crc kubenswrapper[4781]: W0227 00:07:12.908140 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32c19e2e_0830_47a5_9ea8_862e1c9d8571.slice/crio-49b7bfb8be1ec4f939d5551164b703478eafd97057bba8eaacc08c4bdae4f0a5 WatchSource:0}: Error finding container 49b7bfb8be1ec4f939d5551164b703478eafd97057bba8eaacc08c4bdae4f0a5: Status 404 returned error can't find the container with id 49b7bfb8be1ec4f939d5551164b703478eafd97057bba8eaacc08c4bdae4f0a5 Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.927918 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-d2zn6"] Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.928769 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.931780 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.932006 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.932127 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.932278 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.932417 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.932533 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.932637 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.933280 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.933305 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.933314 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.933327 4781 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.933522 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:12Z","lastTransitionTime":"2026-02-27T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.950539 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.965008 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.978700 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:12 crc kubenswrapper[4781]: I0227 00:07:12.991646 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:12Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.004592 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014029 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-node-log\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014069 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-run-systemd\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014088 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-log-socket\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014103 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-cni-bin\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014117 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-ovnkube-config\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014130 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-kubelet\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014143 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-slash\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014159 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-env-overrides\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014173 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014189 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-run-openvswitch\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014213 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-ovnkube-script-lib\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014229 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-ovn-node-metrics-cert\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014243 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5qlg\" (UniqueName: \"kubernetes.io/projected/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-kube-api-access-r5qlg\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014257 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-cni-netd\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: 
I0227 00:07:13.014272 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-var-lib-openvswitch\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014284 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-run-ovn\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014300 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-systemd-units\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014314 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-run-ovn-kubernetes\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014328 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-run-netns\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 
00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.014404 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-etc-openvswitch\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.017077 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.048956 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.052863 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.052907 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.052919 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.052938 4781 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.052950 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:13Z","lastTransitionTime":"2026-02-27T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.076869 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.090905 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.102269 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.112571 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115025 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-ovn-node-metrics-cert\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115050 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5qlg\" (UniqueName: \"kubernetes.io/projected/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-kube-api-access-r5qlg\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115068 4781 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-var-lib-openvswitch\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115083 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-run-ovn\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115100 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-cni-netd\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115116 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-systemd-units\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115132 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-run-ovn-kubernetes\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115147 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-run-netns\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115181 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-etc-openvswitch\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115197 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-node-log\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115217 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-run-systemd\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115231 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-log-socket\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115256 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-cni-bin\") pod \"ovnkube-node-d2zn6\" 
(UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115270 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-ovnkube-config\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115283 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-kubelet\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115296 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-slash\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115310 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-env-overrides\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115326 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d2zn6\" (UID: 
\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115347 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-run-openvswitch\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115361 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-ovnkube-script-lib\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.115930 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-ovnkube-script-lib\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.116439 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-node-log\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.116546 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-systemd-units\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 
27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.116597 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-run-ovn-kubernetes\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.116648 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-cni-netd\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.116652 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-kubelet\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.116562 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-run-netns\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.116638 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-etc-openvswitch\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.116673 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-var-lib-openvswitch\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.116688 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-log-socket\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.116692 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-cni-bin\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.116700 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.116726 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-slash\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.116742 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-run-systemd\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.116748 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-run-openvswitch\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.117091 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-env-overrides\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.117096 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-ovnkube-config\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.117187 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-run-ovn\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.122189 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-ovn-node-metrics-cert\") pod \"ovnkube-node-d2zn6\" (UID: 
\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.129773 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5qlg\" (UniqueName: \"kubernetes.io/projected/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-kube-api-access-r5qlg\") pod \"ovnkube-node-d2zn6\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.154783 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.154815 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.154824 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.154837 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.154846 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:13Z","lastTransitionTime":"2026-02-27T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.245085 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.257354 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.257420 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.257434 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.257451 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.257496 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:13Z","lastTransitionTime":"2026-02-27T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:13 crc kubenswrapper[4781]: W0227 00:07:13.257697 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12a87c22_b4e1_4aa9_8b3e_a34f7d159239.slice/crio-96af27195e73c8a72996dd4d8221316b5eec9c31c92a51b4fb0d127265c1c59f WatchSource:0}: Error finding container 96af27195e73c8a72996dd4d8221316b5eec9c31c92a51b4fb0d127265c1c59f: Status 404 returned error can't find the container with id 96af27195e73c8a72996dd4d8221316b5eec9c31c92a51b4fb0d127265c1c59f Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.360269 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.360304 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.360314 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.360327 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.360337 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:13Z","lastTransitionTime":"2026-02-27T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.463008 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.463063 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.463076 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.463094 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.463116 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:13Z","lastTransitionTime":"2026-02-27T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.569899 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.569966 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.569985 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.570011 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.570031 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:13Z","lastTransitionTime":"2026-02-27T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.672496 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.672558 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.672577 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.672599 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.672617 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:13Z","lastTransitionTime":"2026-02-27T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.693494 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tlstj" event={"ID":"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb","Type":"ContainerStarted","Data":"3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.693545 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tlstj" event={"ID":"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb","Type":"ContainerStarted","Data":"afca0776f8332a8c4a92e2364a60b965769223e7fdd2984f8a337a5359abdfae"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.694938 4781 generic.go:334] "Generic (PLEG): container finished" podID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerID="34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4" exitCode=0 Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.695002 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerDied","Data":"34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.695032 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerStarted","Data":"96af27195e73c8a72996dd4d8221316b5eec9c31c92a51b4fb0d127265c1c59f"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.698179 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.698231 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"f5be70a2916213c961759992806ae032decbc8c1382f7d82de2a5da221aee089"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.698247 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"49b7bfb8be1ec4f939d5551164b703478eafd97057bba8eaacc08c4bdae4f0a5"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.700402 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.702024 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-d2xt9" event={"ID":"4365da31-2d17-4b58-bb27-bd47b5133a8c","Type":"ContainerStarted","Data":"a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.704677 4781 generic.go:334] "Generic (PLEG): container finished" podID="2f348e07-ea87-45b6-8f2b-6e1b08eda780" containerID="e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8" exitCode=0 Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.704731 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" event={"ID":"2f348e07-ea87-45b6-8f2b-6e1b08eda780","Type":"ContainerDied","Data":"e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.704761 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" 
event={"ID":"2f348e07-ea87-45b6-8f2b-6e1b08eda780","Type":"ContainerStarted","Data":"126f52a357f307cfd5409436394d46cd4bf8f894039fb1caf98e2be20ed3ac05"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.712211 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.734234 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.751185 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.777377 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.777431 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.777455 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.777474 4781 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.777486 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:13Z","lastTransitionTime":"2026-02-27T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.779770 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.795291 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.813538 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.829258 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.845208 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.872788 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.879883 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.879926 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.879936 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.879951 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.879962 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:13Z","lastTransitionTime":"2026-02-27T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.928964 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.944952 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.963172 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.981328 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.982458 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.982506 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.982525 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.982552 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.982571 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:13Z","lastTransitionTime":"2026-02-27T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:13 crc kubenswrapper[4781]: I0227 00:07:13.992297 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.020028 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.023107 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.023207 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:14 crc kubenswrapper[4781]: E0227 00:07:14.023245 4781 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:07:18.023227666 +0000 UTC m=+107.280767220 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.023272 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:14 crc kubenswrapper[4781]: E0227 00:07:14.023300 4781 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.023313 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:14 crc kubenswrapper[4781]: E0227 00:07:14.023367 4781 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:18.023346949 +0000 UTC m=+107.280886573 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.023396 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:14 crc kubenswrapper[4781]: E0227 00:07:14.023425 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 00:07:14 crc kubenswrapper[4781]: E0227 00:07:14.023425 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 00:07:14 crc kubenswrapper[4781]: E0227 00:07:14.023439 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 00:07:14 crc kubenswrapper[4781]: E0227 00:07:14.023445 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 
00:07:14 crc kubenswrapper[4781]: E0227 00:07:14.023448 4781 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:14 crc kubenswrapper[4781]: E0227 00:07:14.023454 4781 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:14 crc kubenswrapper[4781]: E0227 00:07:14.023475 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:18.023469222 +0000 UTC m=+107.281008766 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:14 crc kubenswrapper[4781]: E0227 00:07:14.023486 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:18.023481213 +0000 UTC m=+107.281020767 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:14 crc kubenswrapper[4781]: E0227 00:07:14.023513 4781 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 00:07:14 crc kubenswrapper[4781]: E0227 00:07:14.023557 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:18.023541464 +0000 UTC m=+107.281081128 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.033197 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"nam
e\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.049293 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.065363 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.084683 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:14 crc 
kubenswrapper[4781]: I0227 00:07:14.084710 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.084719 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.084730 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.084740 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:14Z","lastTransitionTime":"2026-02-27T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.085384 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.095817 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.106018 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.118079 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.186793 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.186819 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.186826 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.186838 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.186848 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:14Z","lastTransitionTime":"2026-02-27T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.289979 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.290030 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.290052 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.290078 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.290098 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:14Z","lastTransitionTime":"2026-02-27T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.308257 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:14 crc kubenswrapper[4781]: E0227 00:07:14.308398 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.309107 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.309190 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:14 crc kubenswrapper[4781]: E0227 00:07:14.309292 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:14 crc kubenswrapper[4781]: E0227 00:07:14.309371 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.399858 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.400195 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.400212 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.400235 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.400251 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:14Z","lastTransitionTime":"2026-02-27T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.502465 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.502525 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.502543 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.502569 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.502587 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:14Z","lastTransitionTime":"2026-02-27T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.605901 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.605963 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.605982 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.606056 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.606079 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:14Z","lastTransitionTime":"2026-02-27T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.713808 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.714099 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.714111 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.714125 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.714136 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:14Z","lastTransitionTime":"2026-02-27T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.716457 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerStarted","Data":"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9"} Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.716653 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerStarted","Data":"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87"} Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.716756 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerStarted","Data":"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403"} Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.716842 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerStarted","Data":"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d"} Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.718993 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" event={"ID":"2f348e07-ea87-45b6-8f2b-6e1b08eda780","Type":"ContainerStarted","Data":"3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca"} Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.733249 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.747947 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.761910 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.778012 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.790973 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.810506 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.817659 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.817703 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.817716 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.817733 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.817746 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:14Z","lastTransitionTime":"2026-02-27T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.821311 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.834597 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.843605 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.854357 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.868288 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.920195 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.920523 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.920656 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.920751 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:14 crc kubenswrapper[4781]: I0227 00:07:14.920902 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:14Z","lastTransitionTime":"2026-02-27T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.024389 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.024701 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.024836 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.025004 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.025147 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:15Z","lastTransitionTime":"2026-02-27T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.128451 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.128932 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.129078 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.129221 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.129377 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:15Z","lastTransitionTime":"2026-02-27T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.231973 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.232315 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.232483 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.232663 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.232801 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:15Z","lastTransitionTime":"2026-02-27T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.334937 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.334992 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.335012 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.335031 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.335045 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:15Z","lastTransitionTime":"2026-02-27T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.437288 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.437508 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.437570 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.437649 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.437717 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:15Z","lastTransitionTime":"2026-02-27T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.539704 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.539939 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.540111 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.540195 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.540284 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:15Z","lastTransitionTime":"2026-02-27T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.642621 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.643854 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.644009 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.644201 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.644360 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:15Z","lastTransitionTime":"2026-02-27T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.727469 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerStarted","Data":"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a"} Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.727532 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerStarted","Data":"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a"} Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.730473 4781 generic.go:334] "Generic (PLEG): container finished" podID="2f348e07-ea87-45b6-8f2b-6e1b08eda780" containerID="3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca" exitCode=0 Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.730520 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" event={"ID":"2f348e07-ea87-45b6-8f2b-6e1b08eda780","Type":"ContainerDied","Data":"3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca"} Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.748288 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.748332 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.748348 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.748371 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 
00:07:15.748387 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:15Z","lastTransitionTime":"2026-02-27T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.763081 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.785966 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.801930 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.814402 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.840866 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.851330 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.851358 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.851366 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.851378 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.851387 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:15Z","lastTransitionTime":"2026-02-27T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.860904 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.880231 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.895105 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.912952 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.924788 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.945415 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.954175 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.954219 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.954236 4781 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.954257 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:15 crc kubenswrapper[4781]: I0227 00:07:15.954274 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:15Z","lastTransitionTime":"2026-02-27T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.056729 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.056790 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.056809 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.056832 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.056849 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:16Z","lastTransitionTime":"2026-02-27T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.159229 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.159292 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.159310 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.159336 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.159354 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:16Z","lastTransitionTime":"2026-02-27T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.263345 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.263385 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.263400 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.263421 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.263436 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:16Z","lastTransitionTime":"2026-02-27T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.309003 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.309056 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.309188 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:16 crc kubenswrapper[4781]: E0227 00:07:16.309160 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:16 crc kubenswrapper[4781]: E0227 00:07:16.309340 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:16 crc kubenswrapper[4781]: E0227 00:07:16.309472 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.366302 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.366340 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.366355 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.366372 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.366384 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:16Z","lastTransitionTime":"2026-02-27T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.468594 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.468649 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.468658 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.468675 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.468685 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:16Z","lastTransitionTime":"2026-02-27T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.571485 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.571529 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.571542 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.571585 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.571600 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:16Z","lastTransitionTime":"2026-02-27T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.674500 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.674528 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.674539 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.674554 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.674565 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:16Z","lastTransitionTime":"2026-02-27T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.743935 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" event={"ID":"2f348e07-ea87-45b6-8f2b-6e1b08eda780","Type":"ContainerDied","Data":"6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d"} Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.744074 4781 generic.go:334] "Generic (PLEG): container finished" podID="2f348e07-ea87-45b6-8f2b-6e1b08eda780" containerID="6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d" exitCode=0 Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.762245 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.778791 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.778843 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.778857 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.778875 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.778887 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:16Z","lastTransitionTime":"2026-02-27T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.779763 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.808102 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.826673 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.842156 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.858886 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.875612 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.880970 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:16 crc 
kubenswrapper[4781]: I0227 00:07:16.880995 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.881003 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.881016 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.881026 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:16Z","lastTransitionTime":"2026-02-27T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.897174 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.918374 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\
\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.933313 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.948318 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.986883 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.986920 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.986929 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:16 crc 
kubenswrapper[4781]: I0227 00:07:16.986943 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:16 crc kubenswrapper[4781]: I0227 00:07:16.986951 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:16Z","lastTransitionTime":"2026-02-27T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.089163 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.089194 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.089205 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.089221 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.089233 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:17Z","lastTransitionTime":"2026-02-27T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.191375 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.191427 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.191439 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.191455 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.191467 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:17Z","lastTransitionTime":"2026-02-27T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.294066 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.294091 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.294099 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.294112 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.294120 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:17Z","lastTransitionTime":"2026-02-27T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.397522 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.397579 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.397600 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.397676 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.397701 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:17Z","lastTransitionTime":"2026-02-27T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.500205 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.500264 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.500280 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.500301 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.500316 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:17Z","lastTransitionTime":"2026-02-27T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.603285 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.603345 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.603419 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.603450 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.603470 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:17Z","lastTransitionTime":"2026-02-27T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.706301 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.706409 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.706435 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.706466 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.706487 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:17Z","lastTransitionTime":"2026-02-27T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.759038 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerStarted","Data":"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381"} Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.762841 4781 generic.go:334] "Generic (PLEG): container finished" podID="2f348e07-ea87-45b6-8f2b-6e1b08eda780" containerID="6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e" exitCode=0 Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.762899 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" event={"ID":"2f348e07-ea87-45b6-8f2b-6e1b08eda780","Type":"ContainerDied","Data":"6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e"} Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.783027 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.797601 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.809217 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.809257 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.809269 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.809286 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.809297 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:17Z","lastTransitionTime":"2026-02-27T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.810428 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-27T00:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.830487 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.845772 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.859260 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.886971 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.903164 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.912816 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.912843 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.912852 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:17 crc 
kubenswrapper[4781]: I0227 00:07:17.912866 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.912876 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:17Z","lastTransitionTime":"2026-02-27T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 00:07:17.916076 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 
00:07:17.937440 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/n
et.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:17 crc kubenswrapper[4781]: I0227 
00:07:17.960609 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.014351 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.014398 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.014411 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.014429 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.014441 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:18Z","lastTransitionTime":"2026-02-27T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.062743 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.062824 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.062846 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.062867 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.062943 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2026-02-27 00:07:26.062908657 +0000 UTC m=+115.320448221 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.062956 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.062985 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.062999 4781 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.063021 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.063038 4781 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 
00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.063045 4781 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.063071 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:26.063049331 +0000 UTC m=+115.320588905 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.063085 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.063133 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.063145 4781 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.063094 4781 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:26.063083501 +0000 UTC m=+115.320623075 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.063214 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:26.063194994 +0000 UTC m=+115.320734548 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.063237 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:26.063231015 +0000 UTC m=+115.320770559 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.117100 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.117148 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.117163 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.117181 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.117191 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:18Z","lastTransitionTime":"2026-02-27T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.219859 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.219913 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.219927 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.219944 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.219958 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:18Z","lastTransitionTime":"2026-02-27T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.309161 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.309224 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.309355 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.309293 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.309478 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.309620 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.322240 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.322296 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.322313 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.322336 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.322355 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:18Z","lastTransitionTime":"2026-02-27T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.425576 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.425643 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.425657 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.425672 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.425683 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:18Z","lastTransitionTime":"2026-02-27T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.528920 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.528968 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.528981 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.528999 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.529011 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:18Z","lastTransitionTime":"2026-02-27T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.633554 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.633806 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.633822 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.633843 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.633860 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:18Z","lastTransitionTime":"2026-02-27T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.736265 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.736339 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.736358 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.736381 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.736398 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:18Z","lastTransitionTime":"2026-02-27T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.769210 4781 generic.go:334] "Generic (PLEG): container finished" podID="2f348e07-ea87-45b6-8f2b-6e1b08eda780" containerID="08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd" exitCode=0 Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.769250 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" event={"ID":"2f348e07-ea87-45b6-8f2b-6e1b08eda780","Type":"ContainerDied","Data":"08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd"} Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.793846 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.812849 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.833030 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.839044 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.839091 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.839109 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:18 crc 
kubenswrapper[4781]: I0227 00:07:18.839133 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.839151 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:18Z","lastTransitionTime":"2026-02-27T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.854853 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.871193 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.894290 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.911471 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.911544 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.911567 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.911596 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.911619 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:18Z","lastTransitionTime":"2026-02-27T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.929085 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.932071 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.936582 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.936619 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.936646 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.936663 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.936675 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:18Z","lastTransitionTime":"2026-02-27T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.946615 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.948134 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.951884 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.951938 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.951958 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.951986 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.952006 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:18Z","lastTransitionTime":"2026-02-27T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.960091 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.971756 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.974285 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\
\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.975686 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.975779 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.975824 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.975843 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.975856 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:18Z","lastTransitionTime":"2026-02-27T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:18 crc kubenswrapper[4781]: E0227 00:07:18.992530 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.995425 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.996698 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.996748 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.996761 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.996779 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:18 crc kubenswrapper[4781]: I0227 00:07:18.996795 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:18Z","lastTransitionTime":"2026-02-27T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:19 crc kubenswrapper[4781]: E0227 00:07:19.012828 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: E0227 00:07:19.012980 4781 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.014563 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.014590 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.014604 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.014641 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.014656 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:19Z","lastTransitionTime":"2026-02-27T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.117365 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.117434 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.117460 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.117494 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.117520 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:19Z","lastTransitionTime":"2026-02-27T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.223030 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.223325 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.223340 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.223357 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.223371 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:19Z","lastTransitionTime":"2026-02-27T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.233096 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-rc856"] Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.233493 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-rc856" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.240938 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.243176 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.244150 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.244159 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.263437 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.283278 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.301935 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.321686 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.327234 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.327295 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.327318 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.327347 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.327433 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:19Z","lastTransitionTime":"2026-02-27T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.335062 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.354821 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.376171 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq5gj\" (UniqueName: \"kubernetes.io/projected/a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2-kube-api-access-rq5gj\") pod \"node-ca-rc856\" (UID: \"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\") " pod="openshift-image-registry/node-ca-rc856" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.376226 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2-serviceca\") pod \"node-ca-rc856\" (UID: \"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\") " pod="openshift-image-registry/node-ca-rc856" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.376284 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2-host\") pod \"node-ca-rc856\" (UID: \"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\") " pod="openshift-image-registry/node-ca-rc856" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.376671 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.390723 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.402029 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.421728 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.431553 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.431592 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.431603 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.431651 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.431665 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:19Z","lastTransitionTime":"2026-02-27T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.435728 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.447858 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.477230 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq5gj\" (UniqueName: \"kubernetes.io/projected/a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2-kube-api-access-rq5gj\") pod \"node-ca-rc856\" (UID: \"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\") " pod="openshift-image-registry/node-ca-rc856" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.477270 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2-serviceca\") pod \"node-ca-rc856\" (UID: \"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\") " pod="openshift-image-registry/node-ca-rc856" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.477309 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2-host\") pod \"node-ca-rc856\" (UID: \"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\") " pod="openshift-image-registry/node-ca-rc856" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.477422 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2-host\") pod \"node-ca-rc856\" (UID: \"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\") " pod="openshift-image-registry/node-ca-rc856" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.481985 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2-serviceca\") pod \"node-ca-rc856\" (UID: \"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\") " pod="openshift-image-registry/node-ca-rc856" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.496872 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-rq5gj\" (UniqueName: \"kubernetes.io/projected/a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2-kube-api-access-rq5gj\") pod \"node-ca-rc856\" (UID: \"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\") " pod="openshift-image-registry/node-ca-rc856" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.533617 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.533694 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.533710 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.533731 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.533747 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:19Z","lastTransitionTime":"2026-02-27T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.572077 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-rc856" Feb 27 00:07:19 crc kubenswrapper[4781]: W0227 00:07:19.591420 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda60df0eb_b7b5_4b83_8d09_43fcd7c63ab2.slice/crio-254ee4af5136084e6fbb4d938d96a18bf59daaddcc4f6e83208848cf5ed556ff WatchSource:0}: Error finding container 254ee4af5136084e6fbb4d938d96a18bf59daaddcc4f6e83208848cf5ed556ff: Status 404 returned error can't find the container with id 254ee4af5136084e6fbb4d938d96a18bf59daaddcc4f6e83208848cf5ed556ff Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.636790 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.636838 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.636856 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.636880 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.636899 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:19Z","lastTransitionTime":"2026-02-27T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.739516 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.739573 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.739597 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.739705 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.739734 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:19Z","lastTransitionTime":"2026-02-27T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.773151 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rc856" event={"ID":"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2","Type":"ContainerStarted","Data":"254ee4af5136084e6fbb4d938d96a18bf59daaddcc4f6e83208848cf5ed556ff"} Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.782362 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerStarted","Data":"cd2415d8c30c8c9fd45f8ba46ec6678e04c700743d90a0ef2bfd480457441f78"} Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.782412 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.782428 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.782441 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.794470 4781 generic.go:334] "Generic (PLEG): container finished" podID="2f348e07-ea87-45b6-8f2b-6e1b08eda780" containerID="74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245" exitCode=0 Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.794504 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" event={"ID":"2f348e07-ea87-45b6-8f2b-6e1b08eda780","Type":"ContainerDied","Data":"74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245"} Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.800840 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.815618 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.833971 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.841397 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:19 crc 
kubenswrapper[4781]: I0227 00:07:19.841441 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.841456 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.841477 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.841494 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:19Z","lastTransitionTime":"2026-02-27T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.842880 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.843392 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.849083 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.861114 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.875171 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.889787 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.907698 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.920082 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.940499 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd2415d8c30c8c9fd45f8ba46ec6678e04c700743d90a0ef2bfd480457441f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.944780 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.944832 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.944845 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.944864 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.944880 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:19Z","lastTransitionTime":"2026-02-27T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.953389 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.967872 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.982480 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:19 crc kubenswrapper[4781]: I0227 00:07:19.995850 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.007992 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.033933 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd2415d8c30c8c9fd45f8ba46ec6678e04c700743d90a0ef2bfd480457441f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.046761 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:1
0Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.047333 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 
27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.047351 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.047358 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.047370 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.047378 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:20Z","lastTransitionTime":"2026-02-27T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.056702 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.071712 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.087815 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.100332 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.113275 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.126742 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.143200 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.149709 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.149738 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.149750 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.149765 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.149776 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:20Z","lastTransitionTime":"2026-02-27T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.252219 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.252267 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.252278 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.252295 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.252308 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:20Z","lastTransitionTime":"2026-02-27T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.308902 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.308961 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.308977 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:20 crc kubenswrapper[4781]: E0227 00:07:20.309058 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:20 crc kubenswrapper[4781]: E0227 00:07:20.309166 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:20 crc kubenswrapper[4781]: E0227 00:07:20.309463 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.325446 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.326145 4781 scope.go:117] "RemoveContainer" containerID="556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.359818 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.359876 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.359897 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.359923 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.359943 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:20Z","lastTransitionTime":"2026-02-27T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.462592 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.462659 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.462673 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.462690 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.462704 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:20Z","lastTransitionTime":"2026-02-27T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.565330 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.565367 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.565377 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.565391 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.565400 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:20Z","lastTransitionTime":"2026-02-27T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.668039 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.668092 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.668112 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.668135 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.668151 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:20Z","lastTransitionTime":"2026-02-27T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.770603 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.770664 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.770677 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.770693 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.770706 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:20Z","lastTransitionTime":"2026-02-27T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.799243 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rc856" event={"ID":"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2","Type":"ContainerStarted","Data":"e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5"} Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.801745 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.803957 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226"} Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.804555 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.816273 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" event={"ID":"2f348e07-ea87-45b6-8f2b-6e1b08eda780","Type":"ContainerStarted","Data":"606c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96"} Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.827057 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.844005 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.855558 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.872749 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd2415d8c30c8c9fd45f8ba46ec6678e04c700743d90a0ef2bfd480457441f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.874060 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.874212 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.874350 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.874474 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.874589 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:20Z","lastTransitionTime":"2026-02-27T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.885111 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.897126 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.917512 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\",\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240
bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.929665 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.952228 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.971933 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.979101 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.979191 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.979217 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.979281 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.979308 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:20Z","lastTransitionTime":"2026-02-27T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:20 crc kubenswrapper[4781]: I0227 00:07:20.987112 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.001458 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.013691 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.025321 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.040432 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\",\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.057834 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.079299 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-c
ni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.088585 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.088670 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.088689 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.088709 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.088724 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:21Z","lastTransitionTime":"2026-02-27T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.096968 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.112482 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.123251 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.134271 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.145505 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.165885 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd2415d8c30c8c9fd45f8ba46ec6678e04c700743d90a0ef2bfd480457441f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.179242 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.192455 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.192529 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.192547 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.192567 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.192581 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:21Z","lastTransitionTime":"2026-02-27T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.201443 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.213914 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.295024 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.295082 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.295092 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.295107 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.295148 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:21Z","lastTransitionTime":"2026-02-27T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.330304 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.335040 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.354089 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.367765 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.399300 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.399358 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.399369 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.399383 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.399394 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:21Z","lastTransitionTime":"2026-02-27T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.432039 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd2415d8c30c8c9fd45f8ba46ec6678e04c700743d90a0ef2bfd480457441f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.463074 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:1
0Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.474988 4781 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.488862 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\",\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.505445 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.505836 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.505927 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.506013 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.506126 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:21Z","lastTransitionTime":"2026-02-27T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.507798 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z 
is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.521089 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contai
nerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.532193 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.541050 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.550160 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992
806ae032decbc8c1382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.559765 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.609073 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.609289 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.609348 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.609404 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.609496 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:21Z","lastTransitionTime":"2026-02-27T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.712192 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.712224 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.712233 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.712247 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.712257 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:21Z","lastTransitionTime":"2026-02-27T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.814932 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.814974 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.814986 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.815003 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.815014 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:21Z","lastTransitionTime":"2026-02-27T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.918265 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.918310 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.918322 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.918340 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:21 crc kubenswrapper[4781]: I0227 00:07:21.918352 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:21Z","lastTransitionTime":"2026-02-27T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.020350 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.020409 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.020421 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.020436 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.020450 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:22Z","lastTransitionTime":"2026-02-27T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.123239 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.123284 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.123299 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.123315 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.123327 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:22Z","lastTransitionTime":"2026-02-27T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.226049 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.226137 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.226161 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.226190 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.226215 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:22Z","lastTransitionTime":"2026-02-27T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.309215 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.309287 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.309346 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:22 crc kubenswrapper[4781]: E0227 00:07:22.309405 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:22 crc kubenswrapper[4781]: E0227 00:07:22.309515 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:22 crc kubenswrapper[4781]: E0227 00:07:22.309691 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.328614 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.328668 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.328679 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.328696 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.328708 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:22Z","lastTransitionTime":"2026-02-27T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.431768 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.431839 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.431861 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.431888 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.431907 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:22Z","lastTransitionTime":"2026-02-27T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.533922 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.533982 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.534005 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.534036 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.534058 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:22Z","lastTransitionTime":"2026-02-27T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.636976 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.637043 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.637061 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.637086 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.637104 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:22Z","lastTransitionTime":"2026-02-27T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.740219 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.740292 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.740317 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.740348 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.740370 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:22Z","lastTransitionTime":"2026-02-27T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.824908 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovnkube-controller/0.log" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.829335 4781 generic.go:334] "Generic (PLEG): container finished" podID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerID="cd2415d8c30c8c9fd45f8ba46ec6678e04c700743d90a0ef2bfd480457441f78" exitCode=1 Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.829399 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerDied","Data":"cd2415d8c30c8c9fd45f8ba46ec6678e04c700743d90a0ef2bfd480457441f78"} Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.830508 4781 scope.go:117] "RemoveContainer" containerID="cd2415d8c30c8c9fd45f8ba46ec6678e04c700743d90a0ef2bfd480457441f78" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.843787 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.843843 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.843862 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.843887 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.843906 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:22Z","lastTransitionTime":"2026-02-27T00:07:22Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.862002 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:22Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.886604 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:22Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.906998 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:22Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.927868 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:22Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.949535 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:22Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.951514 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.951535 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.951544 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.951558 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.951567 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:22Z","lastTransitionTime":"2026-02-27T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:22 crc kubenswrapper[4781]: I0227 00:07:22.968378 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:22Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.003507 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd2415d8c30c8c9fd45f8ba46ec6678e04c700743d90a0ef2bfd480457441f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2415d8c30c8c9fd45f8ba46ec6678e04c700743d90a0ef2bfd480457441f78\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:22Z\\\",\\\"message\\\":\\\"9 6566 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:22.358987 6566 reflector.go:311] Stopping reflector *v1.Pod (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:22.359235 6566 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:22.359580 6566 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0227 00:07:22.359758 6566 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 00:07:22.360687 6566 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:22.360715 6566 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:22.360738 6566 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 00:07:22.360814 6566 factory.go:656] Stopping watch factory\\\\nI0227 00:07:22.360840 6566 ovnkube.go:599] Stopped ovnkube\\\\nI0227 00:07:22.360849 6566 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 00:07:22.360873 6566 handler.go:208] Removed *v1.EgressFirewall event handler 
9\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35
b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.036405 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.054680 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.054733 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.054751 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:23 crc 
kubenswrapper[4781]: I0227 00:07:23.054775 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.054792 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:23Z","lastTransitionTime":"2026-02-27T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.054904 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 
00:07:23.086624 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a41805072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":
{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.101957 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\",\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.120467 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.140702 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-c
ni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.154677 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.157377 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.157452 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.157471 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.157497 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.157515 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:23Z","lastTransitionTime":"2026-02-27T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.259723 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.259783 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.259803 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.259827 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.259843 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:23Z","lastTransitionTime":"2026-02-27T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.362751 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.362837 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.362860 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.362892 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.362921 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:23Z","lastTransitionTime":"2026-02-27T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.465196 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.465269 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.465288 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.465315 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.465332 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:23Z","lastTransitionTime":"2026-02-27T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.567277 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.567319 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.567332 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.567350 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.567364 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:23Z","lastTransitionTime":"2026-02-27T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.669783 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.669826 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.669837 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.669854 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.669864 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:23Z","lastTransitionTime":"2026-02-27T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.772503 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.772539 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.772548 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.772566 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:23 crc kubenswrapper[4781]: I0227 00:07:23.772575 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:23Z","lastTransitionTime":"2026-02-27T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.285424 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.285479 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.285496 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.285519 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.285538 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:24Z","lastTransitionTime":"2026-02-27T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.289263 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovnkube-controller/0.log" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.293994 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerStarted","Data":"5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195"} Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.294518 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.308570 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.308587 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:24 crc kubenswrapper[4781]: E0227 00:07:24.308869 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.308595 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:24 crc kubenswrapper[4781]: E0227 00:07:24.309052 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:24 crc kubenswrapper[4781]: E0227 00:07:24.309261 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.315523 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.333473 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.350255 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.369550 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.385198 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.391579 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.391651 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.391666 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.391686 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.391703 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:24Z","lastTransitionTime":"2026-02-27T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.408431 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.435359 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2415d8c30c8c9fd45f8ba46ec6678e04c700743d90a0ef2bfd480457441f78\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:22Z\\\",\\\"message\\\":\\\"9 6566 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:22.358987 6566 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:22.359235 6566 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:22.359580 6566 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0227 00:07:22.359758 6566 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 00:07:22.360687 6566 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:22.360715 6566 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:22.360738 6566 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 00:07:22.360814 6566 factory.go:656] Stopping watch factory\\\\nI0227 00:07:22.360840 6566 ovnkube.go:599] Stopped ovnkube\\\\nI0227 00:07:22.360849 6566 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 00:07:22.360873 6566 handler.go:208] Removed *v1.EgressFirewall event handler 
9\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"
name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.449802 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.465429 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.488777 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a41805072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.494214 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.494251 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.494262 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.494278 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.494289 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:24Z","lastTransitionTime":"2026-02-27T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.505835 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\",\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.524776 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.543949 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-c
ni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.559418 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.597476 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.597841 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.598027 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.598219 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.598402 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:24Z","lastTransitionTime":"2026-02-27T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.702059 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.702147 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.702176 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.702208 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.702225 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:24Z","lastTransitionTime":"2026-02-27T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.806016 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.806070 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.806088 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.806114 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.806131 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:24Z","lastTransitionTime":"2026-02-27T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.909822 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.909885 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.909902 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.909964 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:24 crc kubenswrapper[4781]: I0227 00:07:24.909985 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:24Z","lastTransitionTime":"2026-02-27T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.013282 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.013369 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.013393 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.013421 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.013440 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:25Z","lastTransitionTime":"2026-02-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.116190 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.116253 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.116271 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.116296 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.116314 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:25Z","lastTransitionTime":"2026-02-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.219289 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.219378 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.219402 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.219434 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.219462 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:25Z","lastTransitionTime":"2026-02-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.300364 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovnkube-controller/1.log" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.301533 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovnkube-controller/0.log" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.306891 4781 generic.go:334] "Generic (PLEG): container finished" podID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerID="5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195" exitCode=1 Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.306973 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerDied","Data":"5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195"} Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.307052 4781 scope.go:117] "RemoveContainer" containerID="cd2415d8c30c8c9fd45f8ba46ec6678e04c700743d90a0ef2bfd480457441f78" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.308086 4781 scope.go:117] "RemoveContainer" containerID="5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195" Feb 27 00:07:25 crc kubenswrapper[4781]: E0227 00:07:25.308390 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.322434 4781 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.322474 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.322483 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.322496 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.322505 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:25Z","lastTransitionTime":"2026-02-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.335098 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.346939 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s"] Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.347668 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.350460 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.350462 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.361136 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.381250 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.399349 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.416735 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.425427 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.425474 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.425510 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.425552 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.425573 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:25Z","lastTransitionTime":"2026-02-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.433367 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.454597 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/929a21d9-47cd-44cc-b211-258202a86076-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-pnj4s\" (UID: \"929a21d9-47cd-44cc-b211-258202a86076\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.454661 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v57b4\" (UniqueName: \"kubernetes.io/projected/929a21d9-47cd-44cc-b211-258202a86076-kube-api-access-v57b4\") pod \"ovnkube-control-plane-749d76644c-pnj4s\" (UID: \"929a21d9-47cd-44cc-b211-258202a86076\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.454715 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/929a21d9-47cd-44cc-b211-258202a86076-env-overrides\") pod \"ovnkube-control-plane-749d76644c-pnj4s\" (UID: \"929a21d9-47cd-44cc-b211-258202a86076\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.454762 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/929a21d9-47cd-44cc-b211-258202a86076-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-pnj4s\" (UID: \"929a21d9-47cd-44cc-b211-258202a86076\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.465709 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2415d8c30c8c9fd45f8ba46ec6678e04c700743d90a0ef2bfd480457441f78\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:22Z\\\",\\\"message\\\":\\\"9 6566 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:22.358987 6566 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:22.359235 6566 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:22.359580 6566 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0227 00:07:22.359758 6566 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 00:07:22.360687 6566 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:22.360715 6566 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:22.360738 6566 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 00:07:22.360814 6566 factory.go:656] Stopping watch factory\\\\nI0227 00:07:22.360840 6566 ovnkube.go:599] Stopped ovnkube\\\\nI0227 00:07:22.360849 6566 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 00:07:22.360873 6566 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:24Z\\\",\\\"message\\\":\\\"g *v1.Pod event handler 3 for removal\\\\nI0227 00:07:24.408545 6749 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0227 00:07:24.408563 6749 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 00:07:24.408576 6749 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:24.408581 6749 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:24.408591 6749 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0227 00:07:24.408618 
6749 factory.go:656] Stopping watch factory\\\\nI0227 00:07:24.408501 6749 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0227 00:07:24.408673 6749 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 00:07:24.408684 6749 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 00:07:24.408611 6749 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 00:07:24.408708 6749 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 00:07:24.408782 6749 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 00:07:24.408880 6749 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 00:07:24.408980 6749 ovnkube.go:599] Stopped ovnkube\\\\nI0227 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mount
Path\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-ac
cess-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.485426 4781 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.501966 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.528531 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.528755 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.528839 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.528955 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.529041 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:25Z","lastTransitionTime":"2026-02-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.534460 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a41805072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.555664 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/929a21d9-47cd-44cc-b211-258202a86076-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-pnj4s\" (UID: \"929a21d9-47cd-44cc-b211-258202a86076\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.555883 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v57b4\" (UniqueName: \"kubernetes.io/projected/929a21d9-47cd-44cc-b211-258202a86076-kube-api-access-v57b4\") pod \"ovnkube-control-plane-749d76644c-pnj4s\" (UID: \"929a21d9-47cd-44cc-b211-258202a86076\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.556054 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/929a21d9-47cd-44cc-b211-258202a86076-env-overrides\") pod \"ovnkube-control-plane-749d76644c-pnj4s\" (UID: \"929a21d9-47cd-44cc-b211-258202a86076\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.556849 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/929a21d9-47cd-44cc-b211-258202a86076-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-pnj4s\" (UID: \"929a21d9-47cd-44cc-b211-258202a86076\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.556773 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/929a21d9-47cd-44cc-b211-258202a86076-env-overrides\") pod \"ovnkube-control-plane-749d76644c-pnj4s\" (UID: \"929a21d9-47cd-44cc-b211-258202a86076\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.557052 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/929a21d9-47cd-44cc-b211-258202a86076-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-pnj4s\" (UID: \"929a21d9-47cd-44cc-b211-258202a86076\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.558424 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\",\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 
00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.574578 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/929a21d9-47cd-44cc-b211-258202a86076-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-pnj4s\" (UID: \"929a21d9-47cd-44cc-b211-258202a86076\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.588444 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v57b4\" (UniqueName: \"kubernetes.io/projected/929a21d9-47cd-44cc-b211-258202a86076-kube-api-access-v57b4\") pod \"ovnkube-control-plane-749d76644c-pnj4s\" (UID: \"929a21d9-47cd-44cc-b211-258202a86076\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" Feb 27 
00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.588735 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"
mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 
00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.613908 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\
\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7
e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\
":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"po
dIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.629713 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.630908 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.630967 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.630990 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.631019 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.631041 4781 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:25Z","lastTransitionTime":"2026-02-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.645950 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.670487 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.672319 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\
\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: W0227 00:07:25.690885 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod929a21d9_47cd_44cc_b211_258202a86076.slice/crio-63eac76b570756aaa554c96657c8ea62bf9ba1f65af88dfe4cac9c28439e8107 WatchSource:0}: Error finding container 63eac76b570756aaa554c96657c8ea62bf9ba1f65af88dfe4cac9c28439e8107: Status 404 returned error can't find the container with id 63eac76b570756aaa554c96657c8ea62bf9ba1f65af88dfe4cac9c28439e8107 Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.694871 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.718143 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.734162 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.734214 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.734232 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.734256 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.734273 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:25Z","lastTransitionTime":"2026-02-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.740975 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.758545 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.781972 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2415d8c30c8c9fd45f8ba46ec6678e04c700743d90a0ef2bfd480457441f78\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:22Z\\\",\\\"message\\\":\\\"9 6566 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:22.358987 6566 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:22.359235 6566 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:22.359580 6566 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0227 00:07:22.359758 6566 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 00:07:22.360687 6566 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:22.360715 6566 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:22.360738 6566 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 00:07:22.360814 6566 factory.go:656] Stopping watch factory\\\\nI0227 00:07:22.360840 6566 ovnkube.go:599] Stopped ovnkube\\\\nI0227 00:07:22.360849 6566 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 00:07:22.360873 6566 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:24Z\\\",\\\"message\\\":\\\"g *v1.Pod event handler 3 for removal\\\\nI0227 00:07:24.408545 6749 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0227 00:07:24.408563 6749 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 00:07:24.408576 6749 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:24.408581 6749 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:24.408591 6749 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0227 00:07:24.408618 
6749 factory.go:656] Stopping watch factory\\\\nI0227 00:07:24.408501 6749 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0227 00:07:24.408673 6749 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 00:07:24.408684 6749 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 00:07:24.408611 6749 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 00:07:24.408708 6749 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 00:07:24.408782 6749 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 00:07:24.408880 6749 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 00:07:24.408980 6749 ovnkube.go:599] Stopped ovnkube\\\\nI0227 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mount
Path\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-ac
cess-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.801654 4781 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.820235 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.837434 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.837503 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.837529 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.837561 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.837589 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:25Z","lastTransitionTime":"2026-02-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.841929 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:
07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.866136 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additiona
l-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8
c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/op
t/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.882599 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.903100 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"929a21d9-47cd-44cc-b211-258202a86076\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pnj4s\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.936375 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a4180
5072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.941863 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.941921 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.941938 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.941964 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.941982 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:25Z","lastTransitionTime":"2026-02-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:25 crc kubenswrapper[4781]: I0227 00:07:25.960151 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\",\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.044946 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.045013 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.045034 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.045064 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.045085 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:26Z","lastTransitionTime":"2026-02-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.063174 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.063415 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.063449 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:07:42.063405548 +0000 UTC m=+131.320945142 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.063511 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.063582 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.063655 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.063693 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.063708 4781 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.063765 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:42.063745397 +0000 UTC m=+131.321284961 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.063767 4781 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.063670 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.063837 4781 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.063854 4781 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:42.063832799 +0000 UTC m=+131.321372453 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.063854 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.064017 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:42.063981213 +0000 UTC m=+131.321520817 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.064053 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.064082 4781 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.064152 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:42.064128127 +0000 UTC m=+131.321667801 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.100448 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-kpnjj"] Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.100890 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.100945 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.122469 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\
\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wherea
bouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveRe
adOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.137658 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.147118 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.147171 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.147189 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.147212 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.147229 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:26Z","lastTransitionTime":"2026-02-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.153853 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"929a21d9-47cd-44cc-b211-258202a86076\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pnj4s\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.186152 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a4180
5072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.200752 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\",\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.218951 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.240858 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.249430 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.249733 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.249817 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.249906 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.249976 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:26Z","lastTransitionTime":"2026-02-27T00:07:26Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.263690 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.266091 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs\") pod \"network-metrics-daemon-kpnjj\" (UID: \"e866e388-01ab-407a-a59b-d0ba6c3f6f22\") " pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.266177 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db9s2\" (UniqueName: \"kubernetes.io/projected/e866e388-01ab-407a-a59b-d0ba6c3f6f22-kube-api-access-db9s2\") pod \"network-metrics-daemon-kpnjj\" (UID: \"e866e388-01ab-407a-a59b-d0ba6c3f6f22\") " pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.281260 4781 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.294008 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.305010 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.308235 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.308251 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.308485 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.308603 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.308859 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.308940 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.311991 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" event={"ID":"929a21d9-47cd-44cc-b211-258202a86076","Type":"ContainerStarted","Data":"8d1c61868c1a2527159e483863b112534729ff1c638798f94e2a657dd8a6b25f"} Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.312034 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" event={"ID":"929a21d9-47cd-44cc-b211-258202a86076","Type":"ContainerStarted","Data":"3ab7deecc4eab89bddcac671d7a66c01758c3f9acab29f7e763badfc8146ebbb"} Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.312048 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" event={"ID":"929a21d9-47cd-44cc-b211-258202a86076","Type":"ContainerStarted","Data":"63eac76b570756aaa554c96657c8ea62bf9ba1f65af88dfe4cac9c28439e8107"} Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.315486 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovnkube-controller/1.log" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.324307 4781 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"pha
se\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.325922 4781 scope.go:117] "RemoveContainer" containerID="5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195" Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.326090 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.349909 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2415d8c30c8c9fd45f8ba46ec6678e04c700743d90a0ef2bfd480457441f78\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:22Z\\\",\\\"message\\\":\\\"9 6566 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:22.358987 6566 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:22.359235 6566 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:22.359580 6566 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0227 00:07:22.359758 6566 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 00:07:22.360687 6566 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:22.360715 6566 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:22.360738 6566 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 00:07:22.360814 6566 factory.go:656] Stopping watch factory\\\\nI0227 00:07:22.360840 6566 ovnkube.go:599] Stopped ovnkube\\\\nI0227 00:07:22.360849 6566 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 00:07:22.360873 6566 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:24Z\\\",\\\"message\\\":\\\"g *v1.Pod event handler 3 for removal\\\\nI0227 00:07:24.408545 6749 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0227 00:07:24.408563 6749 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 00:07:24.408576 6749 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:24.408581 6749 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:24.408591 6749 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0227 00:07:24.408618 
6749 factory.go:656] Stopping watch factory\\\\nI0227 00:07:24.408501 6749 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0227 00:07:24.408673 6749 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 00:07:24.408684 6749 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 00:07:24.408611 6749 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 00:07:24.408708 6749 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 00:07:24.408782 6749 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 00:07:24.408880 6749 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 00:07:24.408980 6749 ovnkube.go:599] Stopped ovnkube\\\\nI0227 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mount
Path\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-ac
cess-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.352177 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.352242 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.352261 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.352289 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.352309 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:26Z","lastTransitionTime":"2026-02-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.364076 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.366810 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs\") pod \"network-metrics-daemon-kpnjj\" (UID: \"e866e388-01ab-407a-a59b-d0ba6c3f6f22\") " pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.366885 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db9s2\" (UniqueName: \"kubernetes.io/projected/e866e388-01ab-407a-a59b-d0ba6c3f6f22-kube-api-access-db9s2\") pod \"network-metrics-daemon-kpnjj\" (UID: \"e866e388-01ab-407a-a59b-d0ba6c3f6f22\") " pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.366977 4781 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 
00:07:26.367039 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs podName:e866e388-01ab-407a-a59b-d0ba6c3f6f22 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:26.867022523 +0000 UTC m=+116.124562077 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs") pod "network-metrics-daemon-kpnjj" (UID: "e866e388-01ab-407a-a59b-d0ba6c3f6f22") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.377698 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.384538 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db9s2\" (UniqueName: \"kubernetes.io/projected/e866e388-01ab-407a-a59b-d0ba6c3f6f22-kube-api-access-db9s2\") pod \"network-metrics-daemon-kpnjj\" (UID: \"e866e388-01ab-407a-a59b-d0ba6c3f6f22\") " pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.387938 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e866e388-01ab-407a-a59b-d0ba6c3f6f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpnjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc 
kubenswrapper[4781]: I0227 00:07:26.401447 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.416613 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.425475 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e866e388-01ab-407a-a59b-d0ba6c3f6f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpnjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc 
kubenswrapper[4781]: I0227 00:07:26.440647 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a41805072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.455026 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.455077 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.455096 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.455120 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.455139 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:26Z","lastTransitionTime":"2026-02-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.456658 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\",\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.469449 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.490688 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-c
ni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.502475 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.516721 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"929a21d9-47cd-44cc-b211-258202a86076\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ab7deecc4eab89bddcac671d7a66c01758c3f9acab29f7e763badfc8146ebbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d1c61868c1a2527159e483863b112534729ff1c638798f94e2a657dd8a6b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pnj4s\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.528544 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.542543 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.554977 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.559116 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.559150 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.559163 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:26 crc 
kubenswrapper[4781]: I0227 00:07:26.559181 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.559194 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:26Z","lastTransitionTime":"2026-02-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.575556 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:24Z\\\",\\\"message\\\":\\\"g *v1.Pod event handler 3 for removal\\\\nI0227 00:07:24.408545 6749 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0227 00:07:24.408563 6749 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 00:07:24.408576 6749 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:24.408581 
6749 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:24.408591 6749 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0227 00:07:24.408618 6749 factory.go:656] Stopping watch factory\\\\nI0227 00:07:24.408501 6749 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0227 00:07:24.408673 6749 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 00:07:24.408684 6749 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 00:07:24.408611 6749 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 00:07:24.408708 6749 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 00:07:24.408782 6749 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 00:07:24.408880 6749 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 00:07:24.408980 6749 ovnkube.go:599] Stopped ovnkube\\\\nI0227 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a454360
25a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.592852 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.607797 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.621770 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.661940 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.662017 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.662039 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.662062 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.662113 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:26Z","lastTransitionTime":"2026-02-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.765044 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.765481 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.765546 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.765606 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.765695 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:26Z","lastTransitionTime":"2026-02-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.869370 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.869425 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.869441 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.869466 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.869482 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:26Z","lastTransitionTime":"2026-02-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.872285 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs\") pod \"network-metrics-daemon-kpnjj\" (UID: \"e866e388-01ab-407a-a59b-d0ba6c3f6f22\") " pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.872452 4781 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 00:07:26 crc kubenswrapper[4781]: E0227 00:07:26.872530 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs podName:e866e388-01ab-407a-a59b-d0ba6c3f6f22 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:27.872508578 +0000 UTC m=+117.130048172 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs") pod "network-metrics-daemon-kpnjj" (UID: "e866e388-01ab-407a-a59b-d0ba6c3f6f22") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.972307 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.972354 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.972372 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.972394 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:26 crc kubenswrapper[4781]: I0227 00:07:26.972411 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:26Z","lastTransitionTime":"2026-02-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.075484 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.075532 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.075549 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.075573 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.075589 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:27Z","lastTransitionTime":"2026-02-27T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.178155 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.178204 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.178216 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.178234 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.178246 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:27Z","lastTransitionTime":"2026-02-27T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.281315 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.281351 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.281362 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.281380 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.281392 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:27Z","lastTransitionTime":"2026-02-27T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.384036 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.384071 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.384079 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.384092 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.384101 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:27Z","lastTransitionTime":"2026-02-27T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.487064 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.487160 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.487187 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.487219 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.487242 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:27Z","lastTransitionTime":"2026-02-27T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.591131 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.591182 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.591200 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.591223 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.591247 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:27Z","lastTransitionTime":"2026-02-27T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.695683 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.695741 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.695763 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.695790 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.695810 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:27Z","lastTransitionTime":"2026-02-27T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.799196 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.799259 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.799277 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.799326 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.799345 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:27Z","lastTransitionTime":"2026-02-27T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.887184 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs\") pod \"network-metrics-daemon-kpnjj\" (UID: \"e866e388-01ab-407a-a59b-d0ba6c3f6f22\") " pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:27 crc kubenswrapper[4781]: E0227 00:07:27.887435 4781 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 00:07:27 crc kubenswrapper[4781]: E0227 00:07:27.887535 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs podName:e866e388-01ab-407a-a59b-d0ba6c3f6f22 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:29.887509795 +0000 UTC m=+119.145049379 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs") pod "network-metrics-daemon-kpnjj" (UID: "e866e388-01ab-407a-a59b-d0ba6c3f6f22") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.902596 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.902670 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.902688 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.902711 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:27 crc kubenswrapper[4781]: I0227 00:07:27.902728 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:27Z","lastTransitionTime":"2026-02-27T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.005562 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.005665 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.005692 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.005721 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.005746 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:28Z","lastTransitionTime":"2026-02-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.108541 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.108677 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.108706 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.108736 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.108761 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:28Z","lastTransitionTime":"2026-02-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.211854 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.211918 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.211937 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.211963 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.211982 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:28Z","lastTransitionTime":"2026-02-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.308948 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.308995 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.309021 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.308996 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:28 crc kubenswrapper[4781]: E0227 00:07:28.309111 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:28 crc kubenswrapper[4781]: E0227 00:07:28.309452 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:28 crc kubenswrapper[4781]: E0227 00:07:28.309662 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:07:28 crc kubenswrapper[4781]: E0227 00:07:28.309761 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.316427 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.316491 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.316515 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.316544 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.316566 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:28Z","lastTransitionTime":"2026-02-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.419153 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.419209 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.419229 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.419254 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.419272 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:28Z","lastTransitionTime":"2026-02-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.522558 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.522684 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.522710 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.522740 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.522762 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:28Z","lastTransitionTime":"2026-02-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.625974 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.626143 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.626213 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.626253 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.626325 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:28Z","lastTransitionTime":"2026-02-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.729729 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.729804 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.729823 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.729847 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.729869 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:28Z","lastTransitionTime":"2026-02-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.833125 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.833200 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.833218 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.833244 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.833264 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:28Z","lastTransitionTime":"2026-02-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.935974 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.936035 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.936057 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.936086 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:28 crc kubenswrapper[4781]: I0227 00:07:28.936107 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:28Z","lastTransitionTime":"2026-02-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.039200 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.039276 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.039302 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.039331 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.039352 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:29Z","lastTransitionTime":"2026-02-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.143391 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.143463 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.143484 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.143511 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.143532 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:29Z","lastTransitionTime":"2026-02-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.247288 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.247361 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.247387 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.247413 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.247431 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:29Z","lastTransitionTime":"2026-02-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.350316 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.350399 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.350425 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.350459 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.350483 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:29Z","lastTransitionTime":"2026-02-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.383732 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.383784 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.383802 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.383831 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.383851 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:29Z","lastTransitionTime":"2026-02-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:29 crc kubenswrapper[4781]: E0227 00:07:29.407773 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.413568 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.413624 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.413667 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.413691 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.413709 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:29Z","lastTransitionTime":"2026-02-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:29 crc kubenswrapper[4781]: E0227 00:07:29.434589 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.440176 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.440290 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.440348 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.440376 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.440425 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:29Z","lastTransitionTime":"2026-02-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:29 crc kubenswrapper[4781]: E0227 00:07:29.462277 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.468289 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.468330 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.468341 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.468357 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.468369 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:29Z","lastTransitionTime":"2026-02-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:29 crc kubenswrapper[4781]: E0227 00:07:29.487376 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.492564 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.492665 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.492685 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.492711 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.492733 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:29Z","lastTransitionTime":"2026-02-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:29 crc kubenswrapper[4781]: E0227 00:07:29.513056 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:29 crc kubenswrapper[4781]: E0227 00:07:29.513228 4781 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.515621 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.515697 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.515714 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.515832 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.515854 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:29Z","lastTransitionTime":"2026-02-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.618686 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.618724 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.618735 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.618751 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.618763 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:29Z","lastTransitionTime":"2026-02-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.720732 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.720780 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.720800 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.720826 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.720847 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:29Z","lastTransitionTime":"2026-02-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.823521 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.823566 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.823584 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.823606 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.823653 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:29Z","lastTransitionTime":"2026-02-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.913034 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs\") pod \"network-metrics-daemon-kpnjj\" (UID: \"e866e388-01ab-407a-a59b-d0ba6c3f6f22\") " pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:29 crc kubenswrapper[4781]: E0227 00:07:29.913204 4781 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 00:07:29 crc kubenswrapper[4781]: E0227 00:07:29.913300 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs podName:e866e388-01ab-407a-a59b-d0ba6c3f6f22 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:33.913270274 +0000 UTC m=+123.170809878 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs") pod "network-metrics-daemon-kpnjj" (UID: "e866e388-01ab-407a-a59b-d0ba6c3f6f22") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.926937 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.926997 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.927019 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.927048 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:29 crc kubenswrapper[4781]: I0227 00:07:29.927069 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:29Z","lastTransitionTime":"2026-02-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.030248 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.030293 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.030302 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.030317 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.030329 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:30Z","lastTransitionTime":"2026-02-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.133428 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.133505 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.133527 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.133561 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.133584 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:30Z","lastTransitionTime":"2026-02-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.236572 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.236683 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.236709 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.236738 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.236762 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:30Z","lastTransitionTime":"2026-02-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.308698 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.308767 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.308868 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:30 crc kubenswrapper[4781]: E0227 00:07:30.308868 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.308704 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:30 crc kubenswrapper[4781]: E0227 00:07:30.309218 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:07:30 crc kubenswrapper[4781]: E0227 00:07:30.309311 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:30 crc kubenswrapper[4781]: E0227 00:07:30.309097 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.339129 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.339181 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.339198 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.339221 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.339242 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:30Z","lastTransitionTime":"2026-02-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.441190 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.441236 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.441253 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.441277 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.441296 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:30Z","lastTransitionTime":"2026-02-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.544311 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.544375 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.544395 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.544423 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.544442 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:30Z","lastTransitionTime":"2026-02-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.647287 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.647351 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.647368 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.647394 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.647413 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:30Z","lastTransitionTime":"2026-02-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.750567 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.750659 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.750679 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.750702 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.750720 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:30Z","lastTransitionTime":"2026-02-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.853566 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.853623 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.853664 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.853687 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.853704 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:30Z","lastTransitionTime":"2026-02-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.956158 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.956219 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.956240 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.956272 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:30 crc kubenswrapper[4781]: I0227 00:07:30.956294 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:30Z","lastTransitionTime":"2026-02-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.058974 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.059042 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.059069 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.059097 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.059118 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:31Z","lastTransitionTime":"2026-02-27T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.162107 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.162178 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.162202 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.162232 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.162259 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:31Z","lastTransitionTime":"2026-02-27T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:31 crc kubenswrapper[4781]: E0227 00:07:31.262570 4781 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.342088 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\
\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a41805072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca
233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.363616 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\",\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.386771 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.412775 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-c
ni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:31 crc kubenswrapper[4781]: E0227 00:07:31.419014 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.429755 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.444565 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"929a21d9-47cd-44cc-b211-258202a86076\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ab7deecc4eab89bddcac671d7a66c01758c3f9acab29f
7e763badfc8146ebbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d1c61868c1a2527159e483863b112534729ff1c638798f94e2a657dd8a6b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pnj4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.462464 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.475973 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.489992 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.518248 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.563672 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.573229 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.592990 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:24Z\\\",\\\"message\\\":\\\"g *v1.Pod event handler 3 for removal\\\\nI0227 00:07:24.408545 6749 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0227 00:07:24.408563 6749 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 00:07:24.408576 6749 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:24.408581 
6749 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:24.408591 6749 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0227 00:07:24.408618 6749 factory.go:656] Stopping watch factory\\\\nI0227 00:07:24.408501 6749 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0227 00:07:24.408673 6749 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 00:07:24.408684 6749 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 00:07:24.408611 6749 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 00:07:24.408708 6749 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 00:07:24.408782 6749 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 00:07:24.408880 6749 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 00:07:24.408980 6749 ovnkube.go:599] Stopped ovnkube\\\\nI0227 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a454360
25a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.604874 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.615719 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:31 crc kubenswrapper[4781]: I0227 00:07:31.625127 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e866e388-01ab-407a-a59b-d0ba6c3f6f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpnjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:32 crc 
kubenswrapper[4781]: I0227 00:07:32.308998 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:32 crc kubenswrapper[4781]: I0227 00:07:32.309010 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:32 crc kubenswrapper[4781]: E0227 00:07:32.309187 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:32 crc kubenswrapper[4781]: I0227 00:07:32.309029 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:32 crc kubenswrapper[4781]: E0227 00:07:32.309458 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:32 crc kubenswrapper[4781]: I0227 00:07:32.309474 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:32 crc kubenswrapper[4781]: E0227 00:07:32.309552 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:07:32 crc kubenswrapper[4781]: E0227 00:07:32.310571 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:33 crc kubenswrapper[4781]: I0227 00:07:33.288914 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:07:33 crc kubenswrapper[4781]: I0227 00:07:33.310788 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:33 crc kubenswrapper[4781]: I0227 00:07:33.330387 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:33 crc kubenswrapper[4781]: I0227 00:07:33.345006 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:33 crc kubenswrapper[4781]: I0227 00:07:33.375595 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:24Z\\\",\\\"message\\\":\\\"g *v1.Pod event handler 3 for removal\\\\nI0227 00:07:24.408545 6749 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0227 00:07:24.408563 6749 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 00:07:24.408576 6749 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:24.408581 
6749 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:24.408591 6749 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0227 00:07:24.408618 6749 factory.go:656] Stopping watch factory\\\\nI0227 00:07:24.408501 6749 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0227 00:07:24.408673 6749 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 00:07:24.408684 6749 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 00:07:24.408611 6749 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 00:07:24.408708 6749 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 00:07:24.408782 6749 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 00:07:24.408880 6749 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 00:07:24.408980 6749 ovnkube.go:599] Stopped ovnkube\\\\nI0227 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a454360
25a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:33 crc kubenswrapper[4781]: I0227 00:07:33.396940 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:33 crc kubenswrapper[4781]: I0227 00:07:33.416583 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:33 crc kubenswrapper[4781]: I0227 00:07:33.432881 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e866e388-01ab-407a-a59b-d0ba6c3f6f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpnjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:33 crc 
kubenswrapper[4781]: I0227 00:07:33.458030 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c
980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:33 crc kubenswrapper[4781]: I0227 00:07:33.474400 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc0
86a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:33 crc kubenswrapper[4781]: I0227 00:07:33.492457 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"929a21d9-47cd-44cc-b211-258202a86076\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ab7deecc4eab89bddcac671d7a66c01758c3f9acab29f7e763badfc8146ebbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d1c61868c1a2527159e483863b112534729f
f1c638798f94e2a657dd8a6b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pnj4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:33 crc kubenswrapper[4781]: I0227 00:07:33.526373 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a41805072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:33 crc kubenswrapper[4781]: I0227 00:07:33.548322 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\",\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:33 crc kubenswrapper[4781]: I0227 00:07:33.569489 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:33 crc kubenswrapper[4781]: I0227 00:07:33.591361 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:33 crc kubenswrapper[4781]: I0227 00:07:33.610518 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:33 crc kubenswrapper[4781]: I0227 00:07:33.628172 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:33 crc kubenswrapper[4781]: I0227 00:07:33.954901 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs\") pod \"network-metrics-daemon-kpnjj\" (UID: \"e866e388-01ab-407a-a59b-d0ba6c3f6f22\") " pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:33 crc kubenswrapper[4781]: E0227 00:07:33.955068 4781 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Feb 27 00:07:33 crc kubenswrapper[4781]: E0227 00:07:33.955164 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs podName:e866e388-01ab-407a-a59b-d0ba6c3f6f22 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:41.955141254 +0000 UTC m=+131.212680838 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs") pod "network-metrics-daemon-kpnjj" (UID: "e866e388-01ab-407a-a59b-d0ba6c3f6f22") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 00:07:34 crc kubenswrapper[4781]: I0227 00:07:34.308790 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:34 crc kubenswrapper[4781]: I0227 00:07:34.308892 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:34 crc kubenswrapper[4781]: I0227 00:07:34.308915 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:34 crc kubenswrapper[4781]: I0227 00:07:34.308914 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:34 crc kubenswrapper[4781]: E0227 00:07:34.309056 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:07:34 crc kubenswrapper[4781]: E0227 00:07:34.309294 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:34 crc kubenswrapper[4781]: E0227 00:07:34.309428 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:34 crc kubenswrapper[4781]: E0227 00:07:34.309532 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:36 crc kubenswrapper[4781]: I0227 00:07:36.309300 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:36 crc kubenswrapper[4781]: I0227 00:07:36.309410 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:36 crc kubenswrapper[4781]: I0227 00:07:36.309343 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:36 crc kubenswrapper[4781]: I0227 00:07:36.309499 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:36 crc kubenswrapper[4781]: E0227 00:07:36.309564 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:07:36 crc kubenswrapper[4781]: E0227 00:07:36.309731 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:36 crc kubenswrapper[4781]: E0227 00:07:36.309850 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:36 crc kubenswrapper[4781]: E0227 00:07:36.309913 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:36 crc kubenswrapper[4781]: E0227 00:07:36.420399 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:07:38 crc kubenswrapper[4781]: I0227 00:07:38.309126 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:38 crc kubenswrapper[4781]: I0227 00:07:38.309213 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:38 crc kubenswrapper[4781]: I0227 00:07:38.309250 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:38 crc kubenswrapper[4781]: E0227 00:07:38.309405 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:07:38 crc kubenswrapper[4781]: I0227 00:07:38.309559 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:38 crc kubenswrapper[4781]: E0227 00:07:38.309778 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:38 crc kubenswrapper[4781]: E0227 00:07:38.309938 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:38 crc kubenswrapper[4781]: E0227 00:07:38.310676 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:38 crc kubenswrapper[4781]: I0227 00:07:38.311298 4781 scope.go:117] "RemoveContainer" containerID="5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.377576 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovnkube-controller/2.log" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.378690 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovnkube-controller/1.log" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.382702 4781 generic.go:334] "Generic (PLEG): container finished" podID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerID="cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807" exitCode=1 Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.382772 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerDied","Data":"cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807"} Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.382832 4781 scope.go:117] "RemoveContainer" containerID="5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.383970 4781 scope.go:117] "RemoveContainer" containerID="cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807" Feb 27 00:07:39 crc kubenswrapper[4781]: E0227 00:07:39.384220 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.401886 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":
\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.425937 4781 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.446320 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.465309 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.482043 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.496962 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.526547 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:24Z\\\",\\\"message\\\":\\\"g *v1.Pod event handler 3 for removal\\\\nI0227 00:07:24.408545 6749 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0227 00:07:24.408563 6749 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 00:07:24.408576 6749 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:24.408581 
6749 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:24.408591 6749 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0227 00:07:24.408618 6749 factory.go:656] Stopping watch factory\\\\nI0227 00:07:24.408501 6749 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0227 00:07:24.408673 6749 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 00:07:24.408684 6749 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 00:07:24.408611 6749 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 00:07:24.408708 6749 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 00:07:24.408782 6749 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 00:07:24.408880 6749 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 00:07:24.408980 6749 ovnkube.go:599] Stopped ovnkube\\\\nI0227 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"ctor.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277240 7001 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 00:07:39.277444 7001 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277696 7001 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 00:07:39.277746 7001 
handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 00:07:39.277820 7001 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 00:07:39.277868 7001 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:39.277886 7001 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:39.277904 7001 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 00:07:39.277900 7001 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 00:07:39.277925 7001 factory.go:656] Stopping watch factory\\\\nI0227 00:07:39.277924 7001 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 00:07:39.277933 7001 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 00:07:39.277947 7001 ovnkube.go:599] Stopped ovnkube\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/v
ar/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.544198 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e866e388-01ab-407a-a59b-d0ba6c3f6f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpnjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc 
kubenswrapper[4781]: I0227 00:07:39.563526 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.583219 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.583307 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.583324 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.583346 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.583361 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:39Z","lastTransitionTime":"2026-02-27T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.586580 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: E0227 00:07:39.606760 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.610506 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\
\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.611561 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.611617 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.611656 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.611679 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.611696 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:39Z","lastTransitionTime":"2026-02-27T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:39 crc kubenswrapper[4781]: E0227 00:07:39.629819 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.635038 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.635077 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.635089 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.635106 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.635117 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:39Z","lastTransitionTime":"2026-02-27T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.635310 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.650580 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: E0227 00:07:39.654535 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.659320 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.659364 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.659379 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.659400 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.659415 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:39Z","lastTransitionTime":"2026-02-27T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.667364 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"929a21d9-47cd-44cc-b211-258202a86076\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ab7deecc4eab89bddcac671d7a66c01758c3f9acab29f7e763badfc8146ebbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d1c61868c1a2527159e483863b112534729ff1c638798f94e2a657dd8a6b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pnj4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: E0227 00:07:39.680784 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.685673 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.685707 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.685719 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.685737 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.685769 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:39Z","lastTransitionTime":"2026-02-27T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.695152 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a41805072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: E0227 00:07:39.704540 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:39 crc kubenswrapper[4781]: E0227 00:07:39.704726 4781 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 00:07:39 crc kubenswrapper[4781]: I0227 00:07:39.710194 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300
f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:39Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:40 crc kubenswrapper[4781]: I0227 00:07:40.308724 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:40 crc kubenswrapper[4781]: I0227 00:07:40.308807 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:40 crc kubenswrapper[4781]: I0227 00:07:40.308743 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:40 crc kubenswrapper[4781]: E0227 00:07:40.308906 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:40 crc kubenswrapper[4781]: I0227 00:07:40.308992 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:40 crc kubenswrapper[4781]: E0227 00:07:40.309169 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:07:40 crc kubenswrapper[4781]: E0227 00:07:40.309227 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:40 crc kubenswrapper[4781]: E0227 00:07:40.309264 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:40 crc kubenswrapper[4781]: I0227 00:07:40.390155 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovnkube-controller/2.log" Feb 27 00:07:41 crc kubenswrapper[4781]: I0227 00:07:41.328161 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:41Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:41 crc kubenswrapper[4781]: I0227 00:07:41.349400 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:41Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:41 crc kubenswrapper[4781]: I0227 00:07:41.365598 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e866e388-01ab-407a-a59b-d0ba6c3f6f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpnjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:41Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:41 crc 
kubenswrapper[4781]: I0227 00:07:41.383172 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"929a21d9-47cd-44cc-b211-258202a86076\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ab7deecc4eab89bddcac671d7a66c01758c3f9acab29f7e763badfc8146ebbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d1c61868c1a2527159e483863b112534729ff1c638798f94e2a657dd8a6b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pnj4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:41Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:41 crc kubenswrapper[4781]: I0227 00:07:41.415943 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a41805072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:41Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:41 crc kubenswrapper[4781]: E0227 00:07:41.421034 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:07:41 crc kubenswrapper[4781]: I0227 00:07:41.438398 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300
f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:41Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:41 crc kubenswrapper[4781]: I0227 00:07:41.454453 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:41Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:41 crc kubenswrapper[4781]: I0227 00:07:41.479872 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73
cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:
07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:41Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:41 crc kubenswrapper[4781]: I0227 00:07:41.496873 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:41Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:41 crc kubenswrapper[4781]: I0227 00:07:41.516341 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:41Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:41 crc kubenswrapper[4781]: I0227 00:07:41.532911 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:41Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:41 crc kubenswrapper[4781]: I0227 00:07:41.544256 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:41Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:41 crc kubenswrapper[4781]: I0227 00:07:41.554200 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:41Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:41 crc kubenswrapper[4781]: I0227 00:07:41.584346 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e36ef67f8d32807fbb04ca66b9ed03be84c4fcb06db0a08e0c6a0f257b54195\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:24Z\\\",\\\"message\\\":\\\"g *v1.Pod event handler 3 for removal\\\\nI0227 00:07:24.408545 6749 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0227 00:07:24.408563 6749 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 00:07:24.408576 6749 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:24.408581 
6749 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:24.408591 6749 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0227 00:07:24.408618 6749 factory.go:656] Stopping watch factory\\\\nI0227 00:07:24.408501 6749 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0227 00:07:24.408673 6749 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 00:07:24.408684 6749 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 00:07:24.408611 6749 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 00:07:24.408708 6749 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 00:07:24.408782 6749 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 00:07:24.408880 6749 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 00:07:24.408980 6749 ovnkube.go:599] Stopped ovnkube\\\\nI0227 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"ctor.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277240 7001 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 00:07:39.277444 7001 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277696 7001 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 00:07:39.277746 7001 
handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 00:07:39.277820 7001 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 00:07:39.277868 7001 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:39.277886 7001 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:39.277904 7001 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 00:07:39.277900 7001 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 00:07:39.277925 7001 factory.go:656] Stopping watch factory\\\\nI0227 00:07:39.277924 7001 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 00:07:39.277933 7001 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 00:07:39.277947 7001 ovnkube.go:599] Stopped ovnkube\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/v
ar/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:41Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:41 crc kubenswrapper[4781]: I0227 00:07:41.607871 4781 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:41Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:41 crc kubenswrapper[4781]: I0227 00:07:41.622977 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:41Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:42 crc kubenswrapper[4781]: I0227 00:07:42.046465 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs\") pod \"network-metrics-daemon-kpnjj\" (UID: \"e866e388-01ab-407a-a59b-d0ba6c3f6f22\") " pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.046733 4781 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.046859 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs podName:e866e388-01ab-407a-a59b-d0ba6c3f6f22 nodeName:}" failed. No retries permitted until 2026-02-27 00:07:58.046830029 +0000 UTC m=+147.304369613 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs") pod "network-metrics-daemon-kpnjj" (UID: "e866e388-01ab-407a-a59b-d0ba6c3f6f22") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 00:07:42 crc kubenswrapper[4781]: I0227 00:07:42.147268 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:07:42 crc kubenswrapper[4781]: I0227 00:07:42.147378 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.147436 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:08:14.147403234 +0000 UTC m=+163.404942828 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.147480 4781 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 00:07:42 crc kubenswrapper[4781]: I0227 00:07:42.147493 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:42 crc kubenswrapper[4781]: I0227 00:07:42.147575 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.147610 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 00:08:14.147586249 +0000 UTC m=+163.405125833 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 00:07:42 crc kubenswrapper[4781]: I0227 00:07:42.147672 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.147827 4781 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.147883 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 00:08:14.147868737 +0000 UTC m=+163.405408331 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.147968 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.147992 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.148010 4781 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.148057 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 00:08:14.148042551 +0000 UTC m=+163.405582135 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.148137 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.148189 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.148205 4781 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.148248 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 00:08:14.148235707 +0000 UTC m=+163.405775291 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:07:42 crc kubenswrapper[4781]: I0227 00:07:42.309089 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:42 crc kubenswrapper[4781]: I0227 00:07:42.309149 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:42 crc kubenswrapper[4781]: I0227 00:07:42.309172 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:42 crc kubenswrapper[4781]: I0227 00:07:42.309090 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.309438 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.309566 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.310252 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:42 crc kubenswrapper[4781]: E0227 00:07:42.310763 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:42 crc kubenswrapper[4781]: I0227 00:07:42.321930 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.245409 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.247081 4781 scope.go:117] "RemoveContainer" containerID="cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807" Feb 27 00:07:43 crc kubenswrapper[4781]: E0227 00:07:43.247375 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.271222 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.299264 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73
cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:
07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.313414 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.327872 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"929a21d9-47cd-44cc-b211-258202a86076\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ab7deecc4eab89bddcac671d7a66c01758c3f9acab29f7e763badfc8146ebbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d1c61868c1a2527159e483863b112534729f
f1c638798f94e2a657dd8a6b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pnj4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.354286 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a41805072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.372777 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\",\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.386051 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.400714 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.413644 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.425442 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.438898 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.448962 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.469535 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"ctor.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277240 7001 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 00:07:39.277444 7001 reflector.go:311] 
Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277696 7001 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 00:07:39.277746 7001 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 00:07:39.277820 7001 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 00:07:39.277868 7001 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:39.277886 7001 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:39.277904 7001 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 00:07:39.277900 7001 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 00:07:39.277925 7001 factory.go:656] Stopping watch factory\\\\nI0227 00:07:39.277924 7001 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 00:07:39.277933 7001 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 00:07:39.277947 7001 ovnkube.go:599] Stopped ovnkube\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a454360
25a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.480936 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bb01ad4-bdad-4837-a06a-c07cce38a60b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e484c55cbd8737198a15b569a6bea881cbd859019d4ad41402765d0a8922fd2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb69ca988df73334f28c605e94f042976cc3107e97b8a6e7152c6fa400bc214a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466063a9fa3266b8383f8443b5c0dd0851f32f784e3ff1bf116b686e0a0dd326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.490528 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e866e388-01ab-407a-a59b-d0ba6c3f6f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpnjj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.501532 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:43 crc kubenswrapper[4781]: I0227 00:07:43.512192 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:44 crc kubenswrapper[4781]: I0227 00:07:44.308495 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:44 crc kubenswrapper[4781]: E0227 00:07:44.308611 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:44 crc kubenswrapper[4781]: I0227 00:07:44.308668 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:44 crc kubenswrapper[4781]: I0227 00:07:44.308777 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:44 crc kubenswrapper[4781]: E0227 00:07:44.308867 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:44 crc kubenswrapper[4781]: E0227 00:07:44.309013 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:07:44 crc kubenswrapper[4781]: I0227 00:07:44.309322 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:44 crc kubenswrapper[4781]: E0227 00:07:44.309484 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:46 crc kubenswrapper[4781]: I0227 00:07:46.308733 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:46 crc kubenswrapper[4781]: I0227 00:07:46.308804 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:46 crc kubenswrapper[4781]: I0227 00:07:46.308809 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:46 crc kubenswrapper[4781]: I0227 00:07:46.308924 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:46 crc kubenswrapper[4781]: E0227 00:07:46.308915 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:46 crc kubenswrapper[4781]: E0227 00:07:46.309067 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:07:46 crc kubenswrapper[4781]: E0227 00:07:46.309205 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:46 crc kubenswrapper[4781]: E0227 00:07:46.309370 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:46 crc kubenswrapper[4781]: E0227 00:07:46.422128 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:07:48 crc kubenswrapper[4781]: I0227 00:07:48.308760 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:48 crc kubenswrapper[4781]: I0227 00:07:48.308941 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:48 crc kubenswrapper[4781]: I0227 00:07:48.309060 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:48 crc kubenswrapper[4781]: E0227 00:07:48.308963 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:48 crc kubenswrapper[4781]: E0227 00:07:48.309211 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:48 crc kubenswrapper[4781]: E0227 00:07:48.309296 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:07:48 crc kubenswrapper[4781]: I0227 00:07:48.308802 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:48 crc kubenswrapper[4781]: E0227 00:07:48.310365 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:49 crc kubenswrapper[4781]: I0227 00:07:49.999458 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:49 crc kubenswrapper[4781]: I0227 00:07:49.999934 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:49.999955 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:49.999983 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.000002 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:49Z","lastTransitionTime":"2026-02-27T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:50 crc kubenswrapper[4781]: E0227 00:07:50.018283 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.024100 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.024156 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.024178 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.024203 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.024221 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:50Z","lastTransitionTime":"2026-02-27T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:50 crc kubenswrapper[4781]: E0227 00:07:50.044425 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.048809 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.048873 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.048892 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.048916 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.048940 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:50Z","lastTransitionTime":"2026-02-27T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:50 crc kubenswrapper[4781]: E0227 00:07:50.068204 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.073275 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.073307 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.073319 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.073334 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.073345 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:50Z","lastTransitionTime":"2026-02-27T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:50 crc kubenswrapper[4781]: E0227 00:07:50.093126 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.097787 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.097836 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.097851 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.097869 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.097884 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:07:50Z","lastTransitionTime":"2026-02-27T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:07:50 crc kubenswrapper[4781]: E0227 00:07:50.118273 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:50 crc kubenswrapper[4781]: E0227 00:07:50.118506 4781 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.309045 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:50 crc kubenswrapper[4781]: E0227 00:07:50.309163 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.309065 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:50 crc kubenswrapper[4781]: E0227 00:07:50.309230 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.309065 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:50 crc kubenswrapper[4781]: I0227 00:07:50.309237 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:50 crc kubenswrapper[4781]: E0227 00:07:50.309339 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:50 crc kubenswrapper[4781]: E0227 00:07:50.309415 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:07:51 crc kubenswrapper[4781]: I0227 00:07:51.332992 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:51 crc kubenswrapper[4781]: I0227 00:07:51.355932 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:51 crc kubenswrapper[4781]: I0227 00:07:51.373806 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:51 crc kubenswrapper[4781]: I0227 00:07:51.391123 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:51 crc kubenswrapper[4781]: I0227 00:07:51.408843 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:51 crc kubenswrapper[4781]: E0227 00:07:51.422731 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 27 00:07:51 crc kubenswrapper[4781]: I0227 00:07:51.438264 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"ctor.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277240 7001 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 00:07:39.277444 7001 reflector.go:311] 
Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277696 7001 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 00:07:39.277746 7001 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 00:07:39.277820 7001 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 00:07:39.277868 7001 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:39.277886 7001 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:39.277904 7001 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 00:07:39.277900 7001 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 00:07:39.277925 7001 factory.go:656] Stopping watch factory\\\\nI0227 00:07:39.277924 7001 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 00:07:39.277933 7001 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 00:07:39.277947 7001 ovnkube.go:599] Stopped ovnkube\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a454360
25a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:51 crc kubenswrapper[4781]: I0227 00:07:51.456624 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bb01ad4-bdad-4837-a06a-c07cce38a60b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e484c55cbd8737198a15b569a6bea881cbd859019d4ad41402765d0a8922fd2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb69ca988df73334f28c605e94f042976cc3107e97b8a6e7152c6fa400bc214a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466063a9fa3266b8383f8443b5c0dd0851f32f784e3ff1bf116b686e0a0dd326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:51 crc kubenswrapper[4781]: I0227 00:07:51.472355 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:51 crc kubenswrapper[4781]: I0227 00:07:51.488613 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:51 crc kubenswrapper[4781]: I0227 00:07:51.507253 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:51 crc kubenswrapper[4781]: I0227 00:07:51.521964 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e866e388-01ab-407a-a59b-d0ba6c3f6f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpnjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:51 crc 
kubenswrapper[4781]: I0227 00:07:51.533766 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r
q5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:51 crc kubenswrapper[4781]: I0227 00:07:51.548953 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"929a21d9-47cd-44cc-b211-258202a86076\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ab7deecc4eab89bddcac671d7a66c01758c3f9acab29f7e763badfc8146ebbb\\
\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d1c61868c1a2527159e483863b112534729ff1c638798f94e2a657dd8a6b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pnj4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:51 crc kubenswrapper[4781]: I0227 00:07:51.578887 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a41805072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\
\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877
441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:51 crc kubenswrapper[4781]: I0227 00:07:51.595677 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300
f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:51 crc kubenswrapper[4781]: I0227 00:07:51.614991 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:51 crc kubenswrapper[4781]: I0227 00:07:51.640150 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73
cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:
07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 27 00:07:52 crc kubenswrapper[4781]: I0227 00:07:52.309231 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:52 crc kubenswrapper[4781]: I0227 00:07:52.309262 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:52 crc kubenswrapper[4781]: E0227 00:07:52.309954 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:52 crc kubenswrapper[4781]: I0227 00:07:52.309358 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:52 crc kubenswrapper[4781]: E0227 00:07:52.310357 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:52 crc kubenswrapper[4781]: I0227 00:07:52.309294 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:52 crc kubenswrapper[4781]: E0227 00:07:52.310772 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:52 crc kubenswrapper[4781]: E0227 00:07:52.310003 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:07:53 crc kubenswrapper[4781]: I0227 00:07:53.326144 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 27 00:07:54 crc kubenswrapper[4781]: I0227 00:07:54.309033 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:54 crc kubenswrapper[4781]: I0227 00:07:54.309038 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:54 crc kubenswrapper[4781]: I0227 00:07:54.309166 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:54 crc kubenswrapper[4781]: I0227 00:07:54.309178 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:54 crc kubenswrapper[4781]: E0227 00:07:54.309384 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:54 crc kubenswrapper[4781]: E0227 00:07:54.309517 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:54 crc kubenswrapper[4781]: E0227 00:07:54.309752 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:07:54 crc kubenswrapper[4781]: E0227 00:07:54.309916 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:56 crc kubenswrapper[4781]: I0227 00:07:56.308569 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:56 crc kubenswrapper[4781]: I0227 00:07:56.308616 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:56 crc kubenswrapper[4781]: I0227 00:07:56.308714 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:56 crc kubenswrapper[4781]: I0227 00:07:56.308671 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:56 crc kubenswrapper[4781]: E0227 00:07:56.308832 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:56 crc kubenswrapper[4781]: E0227 00:07:56.308963 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:56 crc kubenswrapper[4781]: E0227 00:07:56.309103 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:56 crc kubenswrapper[4781]: E0227 00:07:56.309273 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:07:56 crc kubenswrapper[4781]: E0227 00:07:56.424527 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 27 00:07:58 crc kubenswrapper[4781]: I0227 00:07:58.131735 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs\") pod \"network-metrics-daemon-kpnjj\" (UID: \"e866e388-01ab-407a-a59b-d0ba6c3f6f22\") " pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:58 crc kubenswrapper[4781]: E0227 00:07:58.131942 4781 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 00:07:58 crc kubenswrapper[4781]: E0227 00:07:58.132060 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs podName:e866e388-01ab-407a-a59b-d0ba6c3f6f22 nodeName:}" failed. No retries permitted until 2026-02-27 00:08:30.1320274 +0000 UTC m=+179.389566984 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs") pod "network-metrics-daemon-kpnjj" (UID: "e866e388-01ab-407a-a59b-d0ba6c3f6f22") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 00:07:58 crc kubenswrapper[4781]: I0227 00:07:58.309276 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:07:58 crc kubenswrapper[4781]: I0227 00:07:58.309291 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:07:58 crc kubenswrapper[4781]: I0227 00:07:58.309347 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:07:58 crc kubenswrapper[4781]: I0227 00:07:58.309436 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:07:58 crc kubenswrapper[4781]: E0227 00:07:58.309586 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:07:58 crc kubenswrapper[4781]: E0227 00:07:58.309845 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:07:58 crc kubenswrapper[4781]: E0227 00:07:58.310073 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:07:58 crc kubenswrapper[4781]: E0227 00:07:58.310755 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:07:58 crc kubenswrapper[4781]: I0227 00:07:58.311153 4781 scope.go:117] "RemoveContainer" containerID="cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807" Feb 27 00:07:58 crc kubenswrapper[4781]: E0227 00:07:58.311465 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.308792 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.308928 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:00 crc kubenswrapper[4781]: E0227 00:08:00.309002 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.308823 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:00 crc kubenswrapper[4781]: E0227 00:08:00.309098 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:00 crc kubenswrapper[4781]: E0227 00:08:00.309230 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.309304 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:00 crc kubenswrapper[4781]: E0227 00:08:00.309436 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.441953 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.442020 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.442032 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.442049 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.442064 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:08:00Z","lastTransitionTime":"2026-02-27T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:08:00 crc kubenswrapper[4781]: E0227 00:08:00.456529 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.460592 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.460663 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.460715 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.460744 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.460757 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:08:00Z","lastTransitionTime":"2026-02-27T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.468806 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tlstj_9a6dd1e0-45ab-46f0-b298-d89e47aaeecb/kube-multus/0.log" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.468854 4781 generic.go:334] "Generic (PLEG): container finished" podID="9a6dd1e0-45ab-46f0-b298-d89e47aaeecb" containerID="3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608" exitCode=1 Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.468885 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tlstj" event={"ID":"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb","Type":"ContainerDied","Data":"3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608"} Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.469261 4781 scope.go:117] "RemoveContainer" containerID="3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608" Feb 27 00:08:00 crc kubenswrapper[4781]: E0227 00:08:00.473384 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.477402 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.477429 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.477439 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.477452 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.477460 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:08:00Z","lastTransitionTime":"2026-02-27T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.485351 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: E0227 00:08:00.491004 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.495687 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.495733 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.495753 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.495777 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.495796 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:08:00Z","lastTransitionTime":"2026-02-27T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.500039 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: E0227 00:08:00.509354 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.512850 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"929a21d9-47cd-44cc-b211-258202a86076\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ab7deecc4eab89bddcac671d7a66c01758c3f9acab29f7e763badfc8146ebbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d1c61868c1a2527159e483863b112534729f
f1c638798f94e2a657dd8a6b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pnj4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.513333 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.513365 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.513377 4781 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.513393 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.513405 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:08:00Z","lastTransitionTime":"2026-02-27T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:08:00 crc kubenswrapper[4781]: E0227 00:08:00.526256 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: E0227 00:08:00.526436 4781 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.535078 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a41805072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.548152 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\",\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.566123 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:59Z\\\",\\\"message\\\":\\\"2026-02-27T00:07:14+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2cef12f2-df17-4376-85e9-9a33f8cb4852\\\\n2026-02-27T00:07:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2cef12f2-df17-4376-85e9-9a33f8cb4852 to /host/opt/cni/bin/\\\\n2026-02-27T00:07:14Z [verbose] multus-daemon started\\\\n2026-02-27T00:07:14Z [verbose] Readiness Indicator file check\\\\n2026-02-27T00:07:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.577668 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.587995 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.598561 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.613374 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.623027 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.632283 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.648830 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"ctor.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277240 7001 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 00:07:39.277444 7001 reflector.go:311] 
Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277696 7001 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 00:07:39.277746 7001 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 00:07:39.277820 7001 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 00:07:39.277868 7001 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:39.277886 7001 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:39.277904 7001 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 00:07:39.277900 7001 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 00:07:39.277925 7001 factory.go:656] Stopping watch factory\\\\nI0227 00:07:39.277924 7001 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 00:07:39.277933 7001 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 00:07:39.277947 7001 ovnkube.go:599] Stopped ovnkube\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a454360
25a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.664662 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8daa2a3-b955-4821-8179-45f9c2f35e9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3889beb7621e75bd03228c9b04e770fea726761853b41eccbb6272ee0e5d21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32830bc550c9e0ba52cce04b34a09da88eb495a00d5cf27b85ee7a4a76cd494a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0227 00:05:33.165391 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0227 00:05:33.168776 1 observer_polling.go:159] Starting file observer\\\\nI0227 00:05:33.203828 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 00:05:33.209843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0227 00:06:03.600797 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9607db636af16914ff311df9c254406cd6a32326a229fc879bd3923eba2ad477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d5502a71e2eb41a23613647b53e5e218f6217a28932a75c18b20230d224d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b8a07875d5e307a11567c51fbea52305add97506cdcbcabc73603448f40a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.675854 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bb01ad4-bdad-4837-a06a-c07cce38a60b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e484c55cbd8737198a15b569a6bea881cbd859019d4ad41402765d0a8922fd2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb69ca988df73334f28c605e94f042976cc3107e97b8a6e7152c6fa400bc214a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466063a9fa3266b8383f8443b5c0dd0851f32f784e3ff1bf116b686e0a0dd326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.688131 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.701674 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:00 crc kubenswrapper[4781]: I0227 00:08:00.712127 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e866e388-01ab-407a-a59b-d0ba6c3f6f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpnjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc 
kubenswrapper[4781]: I0227 00:08:01.327898 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.344405 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.360765 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e866e388-01ab-407a-a59b-d0ba6c3f6f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpnjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc 
kubenswrapper[4781]: I0227 00:08:01.383261 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a41805072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.399242 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\",\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.414563 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:59Z\\\",\\\"message\\\":\\\"2026-02-27T00:07:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2cef12f2-df17-4376-85e9-9a33f8cb4852\\\\n2026-02-27T00:07:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2cef12f2-df17-4376-85e9-9a33f8cb4852 to /host/opt/cni/bin/\\\\n2026-02-27T00:07:14Z [verbose] multus-daemon started\\\\n2026-02-27T00:07:14Z [verbose] Readiness Indicator file check\\\\n2026-02-27T00:07:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: E0227 00:08:01.425244 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.434431 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\
\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.445684 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.459229 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"929a21d9-47cd-44cc-b211-258202a86076\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ab7deecc4eab89bddcac671d7a66c01758c3f9acab29f7e763badfc8146ebbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d1c61868c1a2527159e483863b112534729ff1c638798f94e2a657dd8a6b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pnj4s\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.472859 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tlstj_9a6dd1e0-45ab-46f0-b298-d89e47aaeecb/kube-multus/0.log" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.472905 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tlstj" event={"ID":"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb","Type":"ContainerStarted","Data":"3baa523fad3e36a4728c991057de2da0f51fa6a92e36153c58a2fadc65bd7606"} Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.476818 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.496677 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.515916 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.533377 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8daa2a3-b955-4821-8179-45f9c2f35e9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3889beb7621e75bd03228c9b04e770fea726761853b41eccbb6272ee0e5d21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32830bc550c9e0ba52cce04b34a09da88eb495a00d5cf27b85ee7a4a76cd494a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0227 00:05:33.165391 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0227 00:05:33.168776 1 observer_polling.go:159] Starting file observer\\\\nI0227 00:05:33.203828 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 00:05:33.209843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0227 00:06:03.600797 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9607db636af16914ff311df9c254406cd6a32326a229fc879bd3923eba2ad477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d5502a71e2eb41a23613647b53e5e218f6217a28932a75c18b20230d224d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b8a07875d5e307a11567c51fbea52305add97506cdcbcabc73603448f40a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.551396 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bb01ad4-bdad-4837-a06a-c07cce38a60b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e484c55cbd8737198a15b569a6bea881cbd859019d4ad41402765d0a8922fd2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb69ca988df73334f28c605e94f042976cc3107e97b8a6e7152c6fa400bc214a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466063a9fa3266b8383f8443b5c0dd0851f32f784e3ff1bf116b686e0a0dd326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.569009 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.582053 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.595886 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.618147 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"ctor.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277240 7001 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 00:07:39.277444 7001 reflector.go:311] 
Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277696 7001 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 00:07:39.277746 7001 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 00:07:39.277820 7001 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 00:07:39.277868 7001 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:39.277886 7001 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:39.277904 7001 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 00:07:39.277900 7001 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 00:07:39.277925 7001 factory.go:656] Stopping watch factory\\\\nI0227 00:07:39.277924 7001 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 00:07:39.277933 7001 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 00:07:39.277947 7001 ovnkube.go:599] Stopped ovnkube\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a454360
25a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.629709 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.639607 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.659540 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"ctor.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277240 7001 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 00:07:39.277444 7001 reflector.go:311] 
Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277696 7001 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 00:07:39.277746 7001 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 00:07:39.277820 7001 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 00:07:39.277868 7001 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:39.277886 7001 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:39.277904 7001 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 00:07:39.277900 7001 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 00:07:39.277925 7001 factory.go:656] Stopping watch factory\\\\nI0227 00:07:39.277924 7001 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 00:07:39.277933 7001 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 00:07:39.277947 7001 ovnkube.go:599] Stopped ovnkube\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a454360
25a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.676767 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8daa2a3-b955-4821-8179-45f9c2f35e9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3889beb7621e75bd03228c9b04e770fea726761853b41eccbb6272ee0e5d21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32830bc550c9e0ba52cce04b34a09da88eb495a00d5cf27b85ee7a4a76cd494a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0227 00:05:33.165391 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0227 00:05:33.168776 1 observer_polling.go:159] Starting file observer\\\\nI0227 00:05:33.203828 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 00:05:33.209843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0227 00:06:03.600797 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9607db636af16914ff311df9c254406cd6a32326a229fc879bd3923eba2ad477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d5502a71e2eb41a23613647b53e5e218f6217a28932a75c18b20230d224d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b8a07875d5e307a11567c51fbea52305add97506cdcbcabc73603448f40a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.693445 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bb01ad4-bdad-4837-a06a-c07cce38a60b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e484c55cbd8737198a15b569a6bea881cbd859019d4ad41402765d0a8922fd2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb69ca988df73334f28c605e94f042976cc3107e97b8a6e7152c6fa400bc214a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466063a9fa3266b8383f8443b5c0dd0851f32f784e3ff1bf116b686e0a0dd326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.713618 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.728287 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.752522 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.765550 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e866e388-01ab-407a-a59b-d0ba6c3f6f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpnjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc 
kubenswrapper[4781]: I0227 00:08:01.778342 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r
q5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.791934 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"929a21d9-47cd-44cc-b211-258202a86076\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ab7deecc4eab89bddcac671d7a66c01758c3f9acab29f7e763badfc8146ebbb\\
\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d1c61868c1a2527159e483863b112534729ff1c638798f94e2a657dd8a6b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pnj4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.813605 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a41805072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\
\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877
441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.833175 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300
f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.847752 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3baa523fad3e36a4728c991057de2da0f51fa6a92e36153c58a2fadc65bd7606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:59Z\\\",\\\"message\\\":\\\"2026-02-27T00:07:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2cef12f2-df17-4376-85e9-9a33f8cb4852\\\\n2026-02-27T00:07:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2cef12f2-df17-4376-85e9-9a33f8cb4852 to /host/opt/cni/bin/\\\\n2026-02-27T00:07:14Z [verbose] multus-daemon started\\\\n2026-02-27T00:07:14Z [verbose] 
Readiness Indicator file check\\\\n2026-02-27T00:07:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.868532 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606
c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.888604 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.904131 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:01 crc kubenswrapper[4781]: I0227 00:08:01.922276 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:02 crc kubenswrapper[4781]: I0227 00:08:02.309136 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:02 crc kubenswrapper[4781]: I0227 00:08:02.309171 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:02 crc kubenswrapper[4781]: I0227 00:08:02.309159 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:02 crc kubenswrapper[4781]: I0227 00:08:02.309248 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:02 crc kubenswrapper[4781]: E0227 00:08:02.309431 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:02 crc kubenswrapper[4781]: E0227 00:08:02.309585 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:02 crc kubenswrapper[4781]: E0227 00:08:02.309809 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:02 crc kubenswrapper[4781]: E0227 00:08:02.310008 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:04 crc kubenswrapper[4781]: I0227 00:08:04.308617 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:04 crc kubenswrapper[4781]: I0227 00:08:04.308762 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:04 crc kubenswrapper[4781]: I0227 00:08:04.308698 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:04 crc kubenswrapper[4781]: I0227 00:08:04.308617 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:04 crc kubenswrapper[4781]: E0227 00:08:04.308956 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:04 crc kubenswrapper[4781]: E0227 00:08:04.309069 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:04 crc kubenswrapper[4781]: E0227 00:08:04.309180 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:04 crc kubenswrapper[4781]: E0227 00:08:04.309388 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:06 crc kubenswrapper[4781]: I0227 00:08:06.309186 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:06 crc kubenswrapper[4781]: I0227 00:08:06.309225 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:06 crc kubenswrapper[4781]: I0227 00:08:06.309295 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:06 crc kubenswrapper[4781]: E0227 00:08:06.309390 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:06 crc kubenswrapper[4781]: I0227 00:08:06.309407 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:06 crc kubenswrapper[4781]: E0227 00:08:06.309555 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:06 crc kubenswrapper[4781]: E0227 00:08:06.309819 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:06 crc kubenswrapper[4781]: E0227 00:08:06.309938 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:06 crc kubenswrapper[4781]: E0227 00:08:06.426797 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:08:08 crc kubenswrapper[4781]: I0227 00:08:08.308519 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:08 crc kubenswrapper[4781]: I0227 00:08:08.308577 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:08 crc kubenswrapper[4781]: I0227 00:08:08.308543 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:08 crc kubenswrapper[4781]: E0227 00:08:08.308674 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:08 crc kubenswrapper[4781]: I0227 00:08:08.308680 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:08 crc kubenswrapper[4781]: E0227 00:08:08.308776 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:08 crc kubenswrapper[4781]: E0227 00:08:08.308966 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:08 crc kubenswrapper[4781]: E0227 00:08:08.309076 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.309359 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.309449 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.309467 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:10 crc kubenswrapper[4781]: E0227 00:08:10.309512 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.309730 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:10 crc kubenswrapper[4781]: E0227 00:08:10.309719 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:10 crc kubenswrapper[4781]: E0227 00:08:10.309792 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:10 crc kubenswrapper[4781]: E0227 00:08:10.309869 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.860956 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.860999 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.861010 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.861028 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.861040 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:08:10Z","lastTransitionTime":"2026-02-27T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:08:10 crc kubenswrapper[4781]: E0227 00:08:10.877653 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:10Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.882216 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.882275 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.882294 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.882319 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.882335 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:08:10Z","lastTransitionTime":"2026-02-27T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:08:10 crc kubenswrapper[4781]: E0227 00:08:10.900851 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:10Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.905120 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.905168 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.905187 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.905209 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.905227 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:08:10Z","lastTransitionTime":"2026-02-27T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:08:10 crc kubenswrapper[4781]: E0227 00:08:10.922776 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:10Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.927695 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.927846 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.927866 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.927890 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.927909 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:08:10Z","lastTransitionTime":"2026-02-27T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:08:10 crc kubenswrapper[4781]: E0227 00:08:10.948734 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:10Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.953955 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.954004 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.954015 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.954031 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:08:10 crc kubenswrapper[4781]: I0227 00:08:10.954043 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:08:10Z","lastTransitionTime":"2026-02-27T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:08:10 crc kubenswrapper[4781]: E0227 00:08:10.973379 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:10Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:10 crc kubenswrapper[4781]: E0227 00:08:10.973746 4781 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 00:08:11 crc kubenswrapper[4781]: I0227 00:08:11.321760 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f
416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:11 crc kubenswrapper[4781]: I0227 00:08:11.339124 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"ctor.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277240 7001 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 00:07:39.277444 7001 reflector.go:311] 
Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277696 7001 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 00:07:39.277746 7001 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 00:07:39.277820 7001 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 00:07:39.277868 7001 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:39.277886 7001 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:39.277904 7001 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 00:07:39.277900 7001 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 00:07:39.277925 7001 factory.go:656] Stopping watch factory\\\\nI0227 00:07:39.277924 7001 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 00:07:39.277933 7001 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 00:07:39.277947 7001 ovnkube.go:599] Stopped ovnkube\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a454360
25a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:11 crc kubenswrapper[4781]: I0227 00:08:11.351356 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8daa2a3-b955-4821-8179-45f9c2f35e9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3889beb7621e75bd03228c9b04e770fea726761853b41eccbb6272ee0e5d21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32830bc550c9e0ba52cce04b34a09da88eb495a00d5cf27b85ee7a4a76cd494a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0227 00:05:33.165391 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0227 00:05:33.168776 1 observer_polling.go:159] Starting file observer\\\\nI0227 00:05:33.203828 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 00:05:33.209843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0227 00:06:03.600797 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9607db636af16914ff311df9c254406cd6a32326a229fc879bd3923eba2ad477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d5502a71e2eb41a23613647b53e5e218f6217a28932a75c18b20230d224d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b8a07875d5e307a11567c51fbea52305add97506cdcbcabc73603448f40a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:11 crc kubenswrapper[4781]: I0227 00:08:11.361147 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bb01ad4-bdad-4837-a06a-c07cce38a60b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e484c55cbd8737198a15b569a6bea881cbd859019d4ad41402765d0a8922fd2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb69ca988df73334f28c605e94f042976cc3107e97b8a6e7152c6fa400bc214a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466063a9fa3266b8383f8443b5c0dd0851f32f784e3ff1bf116b686e0a0dd326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:11 crc kubenswrapper[4781]: I0227 00:08:11.372296 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:11 crc kubenswrapper[4781]: I0227 00:08:11.382920 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:11 crc kubenswrapper[4781]: I0227 00:08:11.394525 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:11 crc kubenswrapper[4781]: I0227 00:08:11.405385 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:11 crc kubenswrapper[4781]: I0227 00:08:11.415874 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e866e388-01ab-407a-a59b-d0ba6c3f6f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpnjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:11 crc 
kubenswrapper[4781]: E0227 00:08:11.427336 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:08:11 crc kubenswrapper[4781]: I0227 00:08:11.427429 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"929a21d9-47cd-44cc-b211-258202a86076\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ab7deecc4eab89bddcac671d7a66c01758c3f9acab29f7e763badfc8146ebbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state
\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d1c61868c1a2527159e483863b112534729ff1c638798f94e2a657dd8a6b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pnj4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-02-27T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:11 crc kubenswrapper[4781]: I0227 00:08:11.444973 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a41805072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:11 crc kubenswrapper[4781]: I0227 00:08:11.463123 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300
f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:11 crc kubenswrapper[4781]: I0227 00:08:11.474106 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3baa523fad3e36a4728c991057de2da0f51fa6a92e36153c58a2fadc65bd7606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:59Z\\\",\\\"message\\\":\\\"2026-02-27T00:07:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2cef12f2-df17-4376-85e9-9a33f8cb4852\\\\n2026-02-27T00:07:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2cef12f2-df17-4376-85e9-9a33f8cb4852 to /host/opt/cni/bin/\\\\n2026-02-27T00:07:14Z [verbose] multus-daemon started\\\\n2026-02-27T00:07:14Z [verbose] 
Readiness Indicator file check\\\\n2026-02-27T00:07:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:11 crc kubenswrapper[4781]: I0227 00:08:11.488603 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606
c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:11 crc kubenswrapper[4781]: I0227 00:08:11.498951 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:11 crc kubenswrapper[4781]: I0227 00:08:11.513438 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:11 crc kubenswrapper[4781]: I0227 00:08:11.531089 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:11 crc kubenswrapper[4781]: I0227 00:08:11.547206 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:11Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:12 crc kubenswrapper[4781]: I0227 00:08:12.308918 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:12 crc kubenswrapper[4781]: I0227 00:08:12.309003 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:12 crc kubenswrapper[4781]: I0227 00:08:12.309099 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:12 crc kubenswrapper[4781]: E0227 00:08:12.309395 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:12 crc kubenswrapper[4781]: I0227 00:08:12.309436 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:12 crc kubenswrapper[4781]: E0227 00:08:12.309692 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:12 crc kubenswrapper[4781]: E0227 00:08:12.309794 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:12 crc kubenswrapper[4781]: E0227 00:08:12.309923 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.309934 4781 scope.go:117] "RemoveContainer" containerID="cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.522500 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovnkube-controller/2.log" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.526823 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerStarted","Data":"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867"} Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.527245 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.542896 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.555032 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.569059 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.579445 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.599669 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"ctor.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277240 7001 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 00:07:39.277444 7001 reflector.go:311] 
Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277696 7001 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 00:07:39.277746 7001 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 00:07:39.277820 7001 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 00:07:39.277868 7001 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:39.277886 7001 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:39.277904 7001 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 00:07:39.277900 7001 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 00:07:39.277925 7001 factory.go:656] Stopping watch factory\\\\nI0227 00:07:39.277924 7001 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 00:07:39.277933 7001 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 00:07:39.277947 7001 ovnkube.go:599] Stopped 
ovnkube\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\
",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.614301 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8daa2a3-b955-4821-8179-45f9c2f35e9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3889beb7621e75bd03228c9b04e770fea726761853b41eccbb6272ee0e5d21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32830bc550c9e0ba52cce04b34a09da88eb495a00d5cf27b85ee7a4a76cd494a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0227 00:05:33.165391 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0227 00:05:33.168776 1 observer_polling.go:159] Starting file observer\\\\nI0227 00:05:33.203828 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 00:05:33.209843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0227 00:06:03.600797 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9607db636af16914ff311df9c254406cd6a32326a229fc879bd3923eba2ad477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d5502a71e2eb41a23613647b53e5e218f6217a28932a75c18b20230d224d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b8a07875d5e307a11567c51fbea52305add97506cdcbcabc73603448f40a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.629376 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bb01ad4-bdad-4837-a06a-c07cce38a60b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e484c55cbd8737198a15b569a6bea881cbd859019d4ad41402765d0a8922fd2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb69ca988df73334f28c605e94f042976cc3107e97b8a6e7152c6fa400bc214a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466063a9fa3266b8383f8443b5c0dd0851f32f784e3ff1bf116b686e0a0dd326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.645823 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.660517 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.676859 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.700661 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.716643 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e866e388-01ab-407a-a59b-d0ba6c3f6f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpnjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:13 crc 
kubenswrapper[4781]: I0227 00:08:13.740854 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"929a21d9-47cd-44cc-b211-258202a86076\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ab7deecc4eab89bddcac671d7a66c01758c3f9acab29f7e763badfc8146ebbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d1c61868c1a2527159e483863b112534729ff1c638798f94e2a657dd8a6b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pnj4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.775757 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a41805072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.798961 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\",\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.823676 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3baa523fad3e36a4728c991057de2da0f51fa6a92e36153c58a2fadc65bd7606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:59Z\\\",\\\"message\\\":\\\"2026-02-27T00:07:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2cef12f2-df17-4376-85e9-9a33f8cb4852\\\\n2026-02-27T00:07:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2cef12f2-df17-4376-85e9-9a33f8cb4852 to /host/opt/cni/bin/\\\\n2026-02-27T00:07:14Z [verbose] multus-daemon started\\\\n2026-02-27T00:07:14Z [verbose] 
Readiness Indicator file check\\\\n2026-02-27T00:07:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.843133 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606
c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:13 crc kubenswrapper[4781]: I0227 00:08:13.856490 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.204171 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.204277 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.204305 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.204346 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.204379 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:14 crc kubenswrapper[4781]: E0227 00:08:14.204487 4781 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 00:08:14 crc kubenswrapper[4781]: E0227 00:08:14.204539 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 00:09:18.204523918 +0000 UTC m=+227.462063482 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 00:08:14 crc kubenswrapper[4781]: E0227 00:08:14.204758 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:18.204747784 +0000 UTC m=+227.462287358 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:08:14 crc kubenswrapper[4781]: E0227 00:08:14.204824 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 00:08:14 crc kubenswrapper[4781]: E0227 00:08:14.204837 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 00:08:14 crc kubenswrapper[4781]: E0227 00:08:14.204852 4781 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:08:14 crc kubenswrapper[4781]: E0227 00:08:14.204881 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 00:09:18.204872837 +0000 UTC m=+227.462412401 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:08:14 crc kubenswrapper[4781]: E0227 00:08:14.204894 4781 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 00:08:14 crc kubenswrapper[4781]: E0227 00:08:14.204928 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 00:08:14 crc kubenswrapper[4781]: E0227 00:08:14.204966 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 00:08:14 crc kubenswrapper[4781]: E0227 00:08:14.204979 4781 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:08:14 crc 
kubenswrapper[4781]: E0227 00:08:14.205021 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 00:09:18.20499408 +0000 UTC m=+227.462533664 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 00:08:14 crc kubenswrapper[4781]: E0227 00:08:14.205052 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 00:09:18.205037541 +0000 UTC m=+227.462577125 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.308941 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.308983 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.309019 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.309083 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:14 crc kubenswrapper[4781]: E0227 00:08:14.309516 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:14 crc kubenswrapper[4781]: E0227 00:08:14.309905 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:14 crc kubenswrapper[4781]: E0227 00:08:14.310074 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:14 crc kubenswrapper[4781]: E0227 00:08:14.310102 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.538555 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovnkube-controller/3.log" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.539558 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovnkube-controller/2.log" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.543992 4781 generic.go:334] "Generic (PLEG): container finished" podID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerID="ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867" exitCode=1 Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.544045 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerDied","Data":"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867"} Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.544099 4781 scope.go:117] "RemoveContainer" containerID="cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.545140 4781 scope.go:117] "RemoveContainer" containerID="ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867" Feb 27 00:08:14 crc 
kubenswrapper[4781]: E0227 00:08:14.545409 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.573136 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"s
tate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\
"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a41805072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9
fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731473
1ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.591785 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300
f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.607171 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3baa523fad3e36a4728c991057de2da0f51fa6a92e36153c58a2fadc65bd7606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:59Z\\\",\\\"message\\\":\\\"2026-02-27T00:07:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2cef12f2-df17-4376-85e9-9a33f8cb4852\\\\n2026-02-27T00:07:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2cef12f2-df17-4376-85e9-9a33f8cb4852 to /host/opt/cni/bin/\\\\n2026-02-27T00:07:14Z [verbose] multus-daemon started\\\\n2026-02-27T00:07:14Z [verbose] 
Readiness Indicator file check\\\\n2026-02-27T00:07:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.624529 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606
c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.637759 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.654393 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"929a21d9-47cd-44cc-b211-258202a86076\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ab7deecc4eab89bddcac671d7a66c01758c3f9acab29f7e763badfc8146ebbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d1c61868c1a2527159e483863b112534729ff1c638798f94e2a657dd8a6b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pnj4s\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.668958 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.683530 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.696941 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.713568 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8daa2a3-b955-4821-8179-45f9c2f35e9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3889beb7621e75bd03228c9b04e770fea726761853b41eccbb6272ee0e5d21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32830bc550c9e0ba52cce04b34a09da88eb495a00d5cf27b85ee7a4a76cd494a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0227 00:05:33.165391 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0227 00:05:33.168776 1 observer_polling.go:159] Starting file observer\\\\nI0227 00:05:33.203828 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 00:05:33.209843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0227 00:06:03.600797 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9607db636af16914ff311df9c254406cd6a32326a229fc879bd3923eba2ad477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d5502a71e2eb41a23613647b53e5e218f6217a28932a75c18b20230d224d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b8a07875d5e307a11567c51fbea52305add97506cdcbcabc73603448f40a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.728277 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bb01ad4-bdad-4837-a06a-c07cce38a60b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e484c55cbd8737198a15b569a6bea881cbd859019d4ad41402765d0a8922fd2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb69ca988df73334f28c605e94f042976cc3107e97b8a6e7152c6fa400bc214a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466063a9fa3266b8383f8443b5c0dd0851f32f784e3ff1bf116b686e0a0dd326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.745712 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.760129 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.771249 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.792379 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca1aca847263f3c2df2b4b932332517a23a3b712a16a039c9d911ab7f74c807\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:39Z\\\",\\\"message\\\":\\\"ctor.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277240 7001 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 00:07:39.277444 7001 reflector.go:311] 
Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 00:07:39.277696 7001 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 00:07:39.277746 7001 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 00:07:39.277820 7001 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 00:07:39.277868 7001 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 00:07:39.277886 7001 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 00:07:39.277904 7001 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 00:07:39.277900 7001 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 00:07:39.277925 7001 factory.go:656] Stopping watch factory\\\\nI0227 00:07:39.277924 7001 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 00:07:39.277933 7001 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 00:07:39.277947 7001 ovnkube.go:599] Stopped ovnkube\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:08:14Z\\\",\\\"message\\\":\\\"ch for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"53c717ca-2174-4315-bb03-c937a9c0d9b6\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.188\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0227 00:08:14.171587 7345 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b74
8d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.807788 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.822050 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:14 crc kubenswrapper[4781]: I0227 00:08:14.834342 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e866e388-01ab-407a-a59b-d0ba6c3f6f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpnjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:14Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:15 crc 
kubenswrapper[4781]: I0227 00:08:15.548115 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovnkube-controller/3.log" Feb 27 00:08:15 crc kubenswrapper[4781]: I0227 00:08:15.552258 4781 scope.go:117] "RemoveContainer" containerID="ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867" Feb 27 00:08:15 crc kubenswrapper[4781]: E0227 00:08:15.552532 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" Feb 27 00:08:15 crc kubenswrapper[4781]: I0227 00:08:15.567890 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:15 crc kubenswrapper[4781]: I0227 00:08:15.580290 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:15 crc kubenswrapper[4781]: I0227 00:08:15.591184 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:15 crc kubenswrapper[4781]: I0227 00:08:15.603074 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8daa2a3-b955-4821-8179-45f9c2f35e9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3889beb7621e75bd03228c9b04e770fea726761853b41eccbb6272ee0e5d21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32830bc550c9e0ba52cce04b34a09da88eb495a00d5cf27b85ee7a4a76cd494a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0227 00:05:33.165391 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0227 00:05:33.168776 1 observer_polling.go:159] Starting file observer\\\\nI0227 00:05:33.203828 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 00:05:33.209843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0227 00:06:03.600797 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9607db636af16914ff311df9c254406cd6a32326a229fc879bd3923eba2ad477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d5502a71e2eb41a23613647b53e5e218f6217a28932a75c18b20230d224d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b8a07875d5e307a11567c51fbea52305add97506cdcbcabc73603448f40a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:15 crc kubenswrapper[4781]: I0227 00:08:15.613933 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bb01ad4-bdad-4837-a06a-c07cce38a60b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e484c55cbd8737198a15b569a6bea881cbd859019d4ad41402765d0a8922fd2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb69ca988df73334f28c605e94f042976cc3107e97b8a6e7152c6fa400bc214a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466063a9fa3266b8383f8443b5c0dd0851f32f784e3ff1bf116b686e0a0dd326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:15 crc kubenswrapper[4781]: I0227 00:08:15.626886 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:15 crc kubenswrapper[4781]: I0227 00:08:15.638699 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:15 crc kubenswrapper[4781]: I0227 00:08:15.647077 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:15 crc kubenswrapper[4781]: I0227 00:08:15.664294 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:08:14Z\\\",\\\"message\\\":\\\"ch for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"53c717ca-2174-4315-bb03-c937a9c0d9b6\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.188\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0227 00:08:14.171587 7345 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:08:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a454360
25a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:15 crc kubenswrapper[4781]: I0227 00:08:15.674573 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:15 crc kubenswrapper[4781]: I0227 00:08:15.685971 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:15 crc kubenswrapper[4781]: I0227 00:08:15.695752 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e866e388-01ab-407a-a59b-d0ba6c3f6f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpnjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:15 crc 
kubenswrapper[4781]: I0227 00:08:15.715143 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a41805072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:15 crc kubenswrapper[4781]: I0227 00:08:15.728137 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\",\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:15 crc kubenswrapper[4781]: I0227 00:08:15.742114 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3baa523fad3e36a4728c991057de2da0f51fa6a92e36153c58a2fadc65bd7606\\\",\\\"image\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:59Z\\\",\\\"message\\\":\\\"2026-02-27T00:07:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2cef12f2-df17-4376-85e9-9a33f8cb4852\\\\n2026-02-27T00:07:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2cef12f2-df17-4376-85e9-9a33f8cb4852 to /host/opt/cni/bin/\\\\n2026-02-27T00:07:14Z [verbose] multus-daemon started\\\\n2026-02-27T00:07:14Z [verbose] Readiness Indicator file check\\\\n2026-02-27T00:07:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:15 crc kubenswrapper[4781]: I0227 00:08:15.757666 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\
"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"
exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:15 crc kubenswrapper[4781]: I0227 00:08:15.767211 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:15 crc kubenswrapper[4781]: I0227 00:08:15.778427 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"929a21d9-47cd-44cc-b211-258202a86076\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ab7deecc4eab89bddcac671d7a66c01758c3f9acab29f7e763badfc8146ebbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d1c61868c1a2527159e483863b112534729ff1c638798f94e2a657dd8a6b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pnj4s\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:16 crc kubenswrapper[4781]: I0227 00:08:16.309224 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:16 crc kubenswrapper[4781]: I0227 00:08:16.309299 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:16 crc kubenswrapper[4781]: I0227 00:08:16.309260 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:16 crc kubenswrapper[4781]: E0227 00:08:16.309411 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:16 crc kubenswrapper[4781]: I0227 00:08:16.309386 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:16 crc kubenswrapper[4781]: E0227 00:08:16.309568 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:16 crc kubenswrapper[4781]: E0227 00:08:16.309606 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:16 crc kubenswrapper[4781]: E0227 00:08:16.309684 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:16 crc kubenswrapper[4781]: E0227 00:08:16.429158 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:08:18 crc kubenswrapper[4781]: I0227 00:08:18.308753 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:18 crc kubenswrapper[4781]: I0227 00:08:18.308798 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:18 crc kubenswrapper[4781]: E0227 00:08:18.309321 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:18 crc kubenswrapper[4781]: I0227 00:08:18.308991 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:18 crc kubenswrapper[4781]: I0227 00:08:18.308925 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:18 crc kubenswrapper[4781]: E0227 00:08:18.309430 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:18 crc kubenswrapper[4781]: E0227 00:08:18.309517 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:18 crc kubenswrapper[4781]: E0227 00:08:18.309700 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:20 crc kubenswrapper[4781]: I0227 00:08:20.309193 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:20 crc kubenswrapper[4781]: I0227 00:08:20.309317 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:20 crc kubenswrapper[4781]: I0227 00:08:20.309193 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:20 crc kubenswrapper[4781]: E0227 00:08:20.309331 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:20 crc kubenswrapper[4781]: I0227 00:08:20.309214 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:20 crc kubenswrapper[4781]: E0227 00:08:20.309531 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:20 crc kubenswrapper[4781]: E0227 00:08:20.309586 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:20 crc kubenswrapper[4781]: E0227 00:08:20.309696 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.319942 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d2xt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4365da31-2d17-4b58-bb27-bd47b5133a8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e11b23da103dd8f68b17e0fe31d94f0191a22fc149732f80bc0ad6b9e33452\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dhwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d2xt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.346436 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:08:14Z\\\",\\\"message\\\":\\\"ch for network=default, existing lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"53c717ca-2174-4315-bb03-c937a9c0d9b6\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.188\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0227 00:08:14.171587 7345 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:08:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34b0423d082a454360
25a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5qlg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d2zn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.347187 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.347218 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.347227 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.347241 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.347251 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:08:21Z","lastTransitionTime":"2026-02-27T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.363876 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8daa2a3-b955-4821-8179-45f9c2f35e9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3889beb7621e75bd03228c9b04e770fea726761853b41eccbb6272ee0e5d21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32830bc550c9e0ba52cce04b34a09da88eb495a00d5cf27b85e
e7a4a76cd494a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 00:05:33.165391 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0227 00:05:33.168776 1 observer_polling.go:159] Starting file observer\\\\nI0227 00:05:33.203828 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 00:05:33.209843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0227 00:06:03.600797 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:06:02Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:06:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9607db636af16914ff311df9c254406cd6a32326a229fc879bd3923eba2ad477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d5502a71e2eb41a23613647b53e5e218f6217a28932a75c18b20230d224d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b8a07875d5e307a11567c51fbea52305add97506cdcbcabc73603448f40a0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: E0227 00:08:21.366931 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.370798 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.370838 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.370850 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.370868 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.370882 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:08:21Z","lastTransitionTime":"2026-02-27T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.381164 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4bb01ad4-bdad-4837-a06a-c07cce38a60b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e484c55cbd8737198a15b569a6bea881cbd859019d4ad41402765d0a8922fd2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb69ca988df73334f28c605e94f042
976cc3107e97b8a6e7152c6fa400bc214a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://466063a9fa3266b8383f8443b5c0dd0851f32f784e3ff1bf116b686e0a0dd326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4678e4843ba89f9104e9867ce9610eacb378ba5edf6c59121d6fefd3294bd3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: E0227 00:08:21.383676 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.387680 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.387709 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.387720 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.387736 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.387745 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:08:21Z","lastTransitionTime":"2026-02-27T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.404377 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: E0227 00:08:21.416940 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.421586 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.421710 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.421816 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.422146 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.422337 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:08:21Z","lastTransitionTime":"2026-02-27T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.426825 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db827a6616608314c894ea1ba8bda786d43e055f527006f8be8a1d3cd085db84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: E0227 00:08:21.429887 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.441557 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: E0227 00:08:21.445215 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.449014 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.449055 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.449070 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.449089 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.449100 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:08:21Z","lastTransitionTime":"2026-02-27T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.463116 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf8a5c4633300f9d6540276dea7b4640a69a81ae81957d156254b83f8345995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9745f9b4113f0e717dc7038309f7987ea158e0465919bf1d440e0084815abd0\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: E0227 00:08:21.467185 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6d1802c0-d9dd-4bd7-99c8-bbe950fe4246\\\",\\\"systemUUID\\\":\\\"673e9d6f-5525-49c7-9d73-70585e17af5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: E0227 00:08:21.467429 4781 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.480210 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e866e388-01ab-407a-a59b-d0ba6c3f6f22\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-db9s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpnjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc 
kubenswrapper[4781]: I0227 00:08:21.498618 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"929a21d9-47cd-44cc-b211-258202a86076\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ab7deecc4eab89bddcac671d7a66c01758c3f9acab29f7e763badfc8146ebbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d1c61868c1a2527159e483863b112534729ff1c638798f94e2a657dd8a6b25f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v57b4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pnj4s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.530858 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2aef704-6d67-4fe1-a598-b84c099c45f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38b7903a647dfbfd5e831ad328e7e0e0da1ca9a044fd7f9c7c788700d7e7bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4acb0830941cae69b9244cf99e67031c4183d78a72dcc4e8b225ec5bc1ef308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de36cafa7bb8b89dbc14a43c9949436efb48316c22dffdcee57db86398259f53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a89e199146ad34ac3d0a41805072282d1d6e9a5c7c1ac2fd243b0b072c152e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61fb5df960c37979d7aae5105e5124c2fcf84b361a5a217e98015cf741195d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f1cd77b07d6451ef3ec9fd36a9186afc3858d7b32e4156cb79ce83ba86415b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c80e04046083e4cf136fdf6a3b8176eadd4bea27f5aa78fe007826f9903e917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1178a70914d6393d81625157bc8f5fb12b362c1f8ce057ca233420195e9bb4ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.546502 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4018277d-2fc3-40ed-937a-cea43dacb894\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:33Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T00:06:34Z\\\",\\\"message\\\":\\\"file observer\\\\nW0227 00:06:34.119146 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 00:06:34.119271 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 00:06:34.120007 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-290240189/tls.crt::/tmp/serving-cert-290240189/tls.key\\\\\\\"\\\\nI0227 00:06:34.447060 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 00:06:34.450077 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 00:06:34.450099 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 00:06:34.450126 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 00:06:34.450133 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 00:06:34.457740 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 00:06:34.457802 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 00:06:34.457897 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457956 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 00:06:34.457991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 00:06:34.458043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 00:06:34.458082 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 00:06:34.458143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 00:06:34.460554 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:06:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:05:34Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:05:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:05:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:05:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.560255 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tlstj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3baa523fad3e36a4728c991057de2da0f51fa6a92e36153c58a2fadc65bd7606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T00:07:59Z\\\",\\\"message\\\":\\\"2026-02-27T00:07:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2cef12f2-df17-4376-85e9-9a33f8cb4852\\\\n2026-02-27T00:07:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2cef12f2-df17-4376-85e9-9a33f8cb4852 to /host/opt/cni/bin/\\\\n2026-02-27T00:07:14Z [verbose] multus-daemon started\\\\n2026-02-27T00:07:14Z [verbose] 
Readiness Indicator file check\\\\n2026-02-27T00:07:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-49nxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tlstj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.576246 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f348e07-ea87-45b6-8f2b-6e1b08eda780\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://606
c34b944cbd0f8717d746c9f71b609a1b4a61ece7d2f27b00dd4f622288e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e66abcb52b54c8742e65b3788a8ca714b6fae5ce4c4377d73cc08d78812e08b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3551a80285f85db7e44432f6a895c980e22a084442bf9d03737a39b421487bca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a99a6791a5dfea3c51cc709a72bffb2c921209e465fd6889963c9d0488afc1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cf1a8674bac3d2bc34f6e9c2432fa621ed8d07bd55b20f6a40353daf60a1c6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08170195092f4070f130f0510d15c12e304ed8e467db854604994e6c1afc6acd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74585270eebbc1a42bcf8a3a41d5bdcdb9c6af2ef95b68fea3512ba838e38245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x98vj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2k4zf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.586447 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rc856" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a60df0eb-b7b5-4b83-8d09-43fcd7c63ab2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3da8823bf1ec2351c9b2793e1f6cdd6acb1b1766d043b112ed7e780f75b62d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rq5gj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rc856\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.600281 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8329c270a54485e2d1102b1850d2dae79982254a4ee635bb1745de96f7cd544d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-02-27T00:07:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.613782 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:21 crc kubenswrapper[4781]: I0227 00:08:21.628672 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32c19e2e-0830-47a5-9ea8-862e1c9d8571\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T00:07:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb166edd22c217067a22a9536260c4cb5ac1e215aba1985233ed538109fcf9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5be70a2916213c961759992806ae032decbc8c1
382f7d82de2a5da221aee089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T00:07:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj8st\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T00:07:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v6fnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 27 00:08:22 crc kubenswrapper[4781]: I0227 00:08:22.308697 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:22 crc kubenswrapper[4781]: I0227 00:08:22.308756 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:22 crc kubenswrapper[4781]: I0227 00:08:22.308712 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:22 crc kubenswrapper[4781]: I0227 00:08:22.308871 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:22 crc kubenswrapper[4781]: E0227 00:08:22.309137 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:22 crc kubenswrapper[4781]: E0227 00:08:22.309241 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:22 crc kubenswrapper[4781]: E0227 00:08:22.309329 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:22 crc kubenswrapper[4781]: E0227 00:08:22.309370 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:23 crc kubenswrapper[4781]: I0227 00:08:23.324236 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 27 00:08:24 crc kubenswrapper[4781]: I0227 00:08:24.309082 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:24 crc kubenswrapper[4781]: I0227 00:08:24.309512 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:24 crc kubenswrapper[4781]: E0227 00:08:24.309496 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:24 crc kubenswrapper[4781]: I0227 00:08:24.309594 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:24 crc kubenswrapper[4781]: I0227 00:08:24.309870 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:24 crc kubenswrapper[4781]: E0227 00:08:24.309917 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:24 crc kubenswrapper[4781]: E0227 00:08:24.310002 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:24 crc kubenswrapper[4781]: E0227 00:08:24.310203 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:26 crc kubenswrapper[4781]: I0227 00:08:26.309015 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:26 crc kubenswrapper[4781]: I0227 00:08:26.309192 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:26 crc kubenswrapper[4781]: E0227 00:08:26.309285 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:26 crc kubenswrapper[4781]: I0227 00:08:26.309237 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:26 crc kubenswrapper[4781]: I0227 00:08:26.309059 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:26 crc kubenswrapper[4781]: E0227 00:08:26.309527 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:26 crc kubenswrapper[4781]: E0227 00:08:26.309668 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:26 crc kubenswrapper[4781]: E0227 00:08:26.309835 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:26 crc kubenswrapper[4781]: E0227 00:08:26.435464 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:08:28 crc kubenswrapper[4781]: I0227 00:08:28.309295 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:28 crc kubenswrapper[4781]: I0227 00:08:28.309400 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:28 crc kubenswrapper[4781]: E0227 00:08:28.309490 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:28 crc kubenswrapper[4781]: I0227 00:08:28.309295 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:28 crc kubenswrapper[4781]: E0227 00:08:28.309603 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:28 crc kubenswrapper[4781]: I0227 00:08:28.309332 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:28 crc kubenswrapper[4781]: E0227 00:08:28.309726 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:28 crc kubenswrapper[4781]: E0227 00:08:28.309769 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:29 crc kubenswrapper[4781]: I0227 00:08:29.310896 4781 scope.go:117] "RemoveContainer" containerID="ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867" Feb 27 00:08:29 crc kubenswrapper[4781]: E0227 00:08:29.311174 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" Feb 27 00:08:30 crc kubenswrapper[4781]: I0227 00:08:30.182459 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs\") pod \"network-metrics-daemon-kpnjj\" (UID: \"e866e388-01ab-407a-a59b-d0ba6c3f6f22\") " pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:30 crc kubenswrapper[4781]: E0227 00:08:30.182576 4781 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 00:08:30 crc kubenswrapper[4781]: E0227 00:08:30.182654 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs podName:e866e388-01ab-407a-a59b-d0ba6c3f6f22 nodeName:}" failed. No retries permitted until 2026-02-27 00:09:34.182611403 +0000 UTC m=+243.440150957 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs") pod "network-metrics-daemon-kpnjj" (UID: "e866e388-01ab-407a-a59b-d0ba6c3f6f22") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 00:08:30 crc kubenswrapper[4781]: I0227 00:08:30.309361 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:30 crc kubenswrapper[4781]: E0227 00:08:30.309506 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:30 crc kubenswrapper[4781]: I0227 00:08:30.309718 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:30 crc kubenswrapper[4781]: I0227 00:08:30.309745 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:30 crc kubenswrapper[4781]: E0227 00:08:30.309817 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:30 crc kubenswrapper[4781]: I0227 00:08:30.309879 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:30 crc kubenswrapper[4781]: E0227 00:08:30.309886 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:30 crc kubenswrapper[4781]: E0227 00:08:30.310114 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.334709 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-d2xt9" podStartSLOduration=125.334679968 podStartE2EDuration="2m5.334679968s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:08:31.332184603 +0000 UTC m=+180.589724157" watchObservedRunningTime="2026-02-27 00:08:31.334679968 +0000 UTC m=+180.592219562" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.378399 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=38.378377497 podStartE2EDuration="38.378377497s" podCreationTimestamp="2026-02-27 00:07:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:08:31.37774199 +0000 UTC m=+180.635281574" watchObservedRunningTime="2026-02-27 00:08:31.378377497 +0000 UTC m=+180.635917061" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.390719 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=49.390675274 podStartE2EDuration="49.390675274s" podCreationTimestamp="2026-02-27 00:07:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:08:31.390316095 +0000 UTC m=+180.647855649" watchObservedRunningTime="2026-02-27 00:08:31.390675274 +0000 UTC m=+180.648214828" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.421901 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=8.421881911 podStartE2EDuration="8.421881911s" podCreationTimestamp="2026-02-27 00:08:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:08:31.42186034 +0000 UTC m=+180.679399954" watchObservedRunningTime="2026-02-27 00:08:31.421881911 +0000 UTC m=+180.679421465" Feb 27 00:08:31 crc kubenswrapper[4781]: E0227 00:08:31.436913 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.484251 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.484312 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.484474 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.486311 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.486366 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T00:08:31Z","lastTransitionTime":"2026-02-27T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.523407 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pnj4s" podStartSLOduration=125.523387683 podStartE2EDuration="2m5.523387683s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:08:31.522423908 +0000 UTC m=+180.779963462" watchObservedRunningTime="2026-02-27 00:08:31.523387683 +0000 UTC m=+180.780927237" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.526267 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv"] Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.526731 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.530028 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.530159 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.530227 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.530309 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.547831 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=70.547815464 
podStartE2EDuration="1m10.547815464s" podCreationTimestamp="2026-02-27 00:07:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:08:31.547235819 +0000 UTC m=+180.804775373" watchObservedRunningTime="2026-02-27 00:08:31.547815464 +0000 UTC m=+180.805355018" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.570978 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=71.570948142 podStartE2EDuration="1m11.570948142s" podCreationTimestamp="2026-02-27 00:07:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:08:31.57085793 +0000 UTC m=+180.828397484" watchObservedRunningTime="2026-02-27 00:08:31.570948142 +0000 UTC m=+180.828487696" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.584390 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-tlstj" podStartSLOduration=125.584367389 podStartE2EDuration="2m5.584367389s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:08:31.583519017 +0000 UTC m=+180.841058581" watchObservedRunningTime="2026-02-27 00:08:31.584367389 +0000 UTC m=+180.841906953" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.596715 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4dd29f3f-2201-4879-a479-3f6a0ed912a5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lk2sv\" (UID: \"4dd29f3f-2201-4879-a479-3f6a0ed912a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 
00:08:31.596777 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4dd29f3f-2201-4879-a479-3f6a0ed912a5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lk2sv\" (UID: \"4dd29f3f-2201-4879-a479-3f6a0ed912a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.596793 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4dd29f3f-2201-4879-a479-3f6a0ed912a5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lk2sv\" (UID: \"4dd29f3f-2201-4879-a479-3f6a0ed912a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.596813 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4dd29f3f-2201-4879-a479-3f6a0ed912a5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lk2sv\" (UID: \"4dd29f3f-2201-4879-a479-3f6a0ed912a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.596983 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dd29f3f-2201-4879-a479-3f6a0ed912a5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lk2sv\" (UID: \"4dd29f3f-2201-4879-a479-3f6a0ed912a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.611248 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-rc856" podStartSLOduration=125.611231683 
podStartE2EDuration="2m5.611231683s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:08:31.610904714 +0000 UTC m=+180.868444298" watchObservedRunningTime="2026-02-27 00:08:31.611231683 +0000 UTC m=+180.868771267" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.611800 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-2k4zf" podStartSLOduration=125.611790417 podStartE2EDuration="2m5.611790417s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:08:31.601395979 +0000 UTC m=+180.858935553" watchObservedRunningTime="2026-02-27 00:08:31.611790417 +0000 UTC m=+180.869329981" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.661298 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podStartSLOduration=125.661276756 podStartE2EDuration="2m5.661276756s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:08:31.651070522 +0000 UTC m=+180.908610086" watchObservedRunningTime="2026-02-27 00:08:31.661276756 +0000 UTC m=+180.918816310" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.698245 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4dd29f3f-2201-4879-a479-3f6a0ed912a5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lk2sv\" (UID: \"4dd29f3f-2201-4879-a479-3f6a0ed912a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" Feb 27 00:08:31 crc 
kubenswrapper[4781]: I0227 00:08:31.698321 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4dd29f3f-2201-4879-a479-3f6a0ed912a5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lk2sv\" (UID: \"4dd29f3f-2201-4879-a479-3f6a0ed912a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.698363 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4dd29f3f-2201-4879-a479-3f6a0ed912a5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lk2sv\" (UID: \"4dd29f3f-2201-4879-a479-3f6a0ed912a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.698422 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dd29f3f-2201-4879-a479-3f6a0ed912a5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lk2sv\" (UID: \"4dd29f3f-2201-4879-a479-3f6a0ed912a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.698436 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4dd29f3f-2201-4879-a479-3f6a0ed912a5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lk2sv\" (UID: \"4dd29f3f-2201-4879-a479-3f6a0ed912a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.698480 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4dd29f3f-2201-4879-a479-3f6a0ed912a5-etc-cvo-updatepayloads\") pod 
\"cluster-version-operator-5c965bbfc6-lk2sv\" (UID: \"4dd29f3f-2201-4879-a479-3f6a0ed912a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.698451 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4dd29f3f-2201-4879-a479-3f6a0ed912a5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lk2sv\" (UID: \"4dd29f3f-2201-4879-a479-3f6a0ed912a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.699276 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4dd29f3f-2201-4879-a479-3f6a0ed912a5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lk2sv\" (UID: \"4dd29f3f-2201-4879-a479-3f6a0ed912a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.703790 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dd29f3f-2201-4879-a479-3f6a0ed912a5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lk2sv\" (UID: \"4dd29f3f-2201-4879-a479-3f6a0ed912a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.714868 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4dd29f3f-2201-4879-a479-3f6a0ed912a5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lk2sv\" (UID: \"4dd29f3f-2201-4879-a479-3f6a0ed912a5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" Feb 27 00:08:31 crc kubenswrapper[4781]: I0227 00:08:31.839247 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" Feb 27 00:08:31 crc kubenswrapper[4781]: W0227 00:08:31.863467 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dd29f3f_2201_4879_a479_3f6a0ed912a5.slice/crio-bbb84661c7519ddd74baa960cadc55bf8c7757bc3ee10e5d9364a41bbccba7c7 WatchSource:0}: Error finding container bbb84661c7519ddd74baa960cadc55bf8c7757bc3ee10e5d9364a41bbccba7c7: Status 404 returned error can't find the container with id bbb84661c7519ddd74baa960cadc55bf8c7757bc3ee10e5d9364a41bbccba7c7 Feb 27 00:08:32 crc kubenswrapper[4781]: I0227 00:08:32.308833 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:32 crc kubenswrapper[4781]: E0227 00:08:32.309334 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:32 crc kubenswrapper[4781]: I0227 00:08:32.308922 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:32 crc kubenswrapper[4781]: I0227 00:08:32.308900 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:32 crc kubenswrapper[4781]: E0227 00:08:32.309476 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:32 crc kubenswrapper[4781]: I0227 00:08:32.308984 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:32 crc kubenswrapper[4781]: E0227 00:08:32.309908 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:32 crc kubenswrapper[4781]: E0227 00:08:32.309862 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:32 crc kubenswrapper[4781]: I0227 00:08:32.401337 4781 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 27 00:08:32 crc kubenswrapper[4781]: I0227 00:08:32.412179 4781 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 27 00:08:32 crc kubenswrapper[4781]: I0227 00:08:32.629353 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" event={"ID":"4dd29f3f-2201-4879-a479-3f6a0ed912a5","Type":"ContainerStarted","Data":"949e7e480620353250e0403b1a0fb8c3d204ec52dd6e02c407deef70af34a2ba"} Feb 27 00:08:32 crc kubenswrapper[4781]: I0227 00:08:32.629428 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" event={"ID":"4dd29f3f-2201-4879-a479-3f6a0ed912a5","Type":"ContainerStarted","Data":"bbb84661c7519ddd74baa960cadc55bf8c7757bc3ee10e5d9364a41bbccba7c7"} Feb 27 00:08:32 crc kubenswrapper[4781]: I0227 00:08:32.642442 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk2sv" podStartSLOduration=126.642426224 podStartE2EDuration="2m6.642426224s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:08:32.641964242 +0000 UTC m=+181.899503826" watchObservedRunningTime="2026-02-27 00:08:32.642426224 +0000 UTC m=+181.899965798" Feb 27 00:08:34 crc kubenswrapper[4781]: I0227 00:08:34.308678 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:34 crc kubenswrapper[4781]: I0227 00:08:34.308790 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:34 crc kubenswrapper[4781]: E0227 00:08:34.308809 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:34 crc kubenswrapper[4781]: I0227 00:08:34.308945 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:34 crc kubenswrapper[4781]: E0227 00:08:34.309049 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:34 crc kubenswrapper[4781]: E0227 00:08:34.309299 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:34 crc kubenswrapper[4781]: I0227 00:08:34.309834 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:34 crc kubenswrapper[4781]: E0227 00:08:34.310001 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:36 crc kubenswrapper[4781]: I0227 00:08:36.308351 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:36 crc kubenswrapper[4781]: I0227 00:08:36.308413 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:36 crc kubenswrapper[4781]: I0227 00:08:36.308459 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:36 crc kubenswrapper[4781]: E0227 00:08:36.308565 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:36 crc kubenswrapper[4781]: I0227 00:08:36.308614 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:36 crc kubenswrapper[4781]: E0227 00:08:36.308718 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:36 crc kubenswrapper[4781]: E0227 00:08:36.308938 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:36 crc kubenswrapper[4781]: E0227 00:08:36.309013 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:36 crc kubenswrapper[4781]: E0227 00:08:36.438576 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:08:38 crc kubenswrapper[4781]: I0227 00:08:38.308374 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:38 crc kubenswrapper[4781]: I0227 00:08:38.308445 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:38 crc kubenswrapper[4781]: I0227 00:08:38.308480 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:38 crc kubenswrapper[4781]: E0227 00:08:38.308582 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:38 crc kubenswrapper[4781]: E0227 00:08:38.308752 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:38 crc kubenswrapper[4781]: I0227 00:08:38.308806 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:38 crc kubenswrapper[4781]: E0227 00:08:38.308920 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:38 crc kubenswrapper[4781]: E0227 00:08:38.308958 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:40 crc kubenswrapper[4781]: I0227 00:08:40.309306 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:40 crc kubenswrapper[4781]: I0227 00:08:40.309421 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:40 crc kubenswrapper[4781]: I0227 00:08:40.309455 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:40 crc kubenswrapper[4781]: I0227 00:08:40.309449 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:40 crc kubenswrapper[4781]: E0227 00:08:40.309828 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:40 crc kubenswrapper[4781]: E0227 00:08:40.310300 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:40 crc kubenswrapper[4781]: E0227 00:08:40.310430 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:40 crc kubenswrapper[4781]: E0227 00:08:40.310601 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:41 crc kubenswrapper[4781]: E0227 00:08:41.439219 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:08:42 crc kubenswrapper[4781]: I0227 00:08:42.308654 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:42 crc kubenswrapper[4781]: I0227 00:08:42.308749 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:42 crc kubenswrapper[4781]: I0227 00:08:42.308749 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:42 crc kubenswrapper[4781]: I0227 00:08:42.308673 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:42 crc kubenswrapper[4781]: E0227 00:08:42.308891 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:42 crc kubenswrapper[4781]: E0227 00:08:42.309040 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:42 crc kubenswrapper[4781]: E0227 00:08:42.309125 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:42 crc kubenswrapper[4781]: E0227 00:08:42.309337 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:43 crc kubenswrapper[4781]: I0227 00:08:43.310249 4781 scope.go:117] "RemoveContainer" containerID="ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867" Feb 27 00:08:43 crc kubenswrapper[4781]: E0227 00:08:43.310470 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-d2zn6_openshift-ovn-kubernetes(12a87c22-b4e1-4aa9-8b3e-a34f7d159239)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" Feb 27 00:08:44 crc kubenswrapper[4781]: I0227 00:08:44.308975 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:44 crc kubenswrapper[4781]: I0227 00:08:44.309055 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:44 crc kubenswrapper[4781]: E0227 00:08:44.309157 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:44 crc kubenswrapper[4781]: I0227 00:08:44.308997 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:44 crc kubenswrapper[4781]: I0227 00:08:44.309271 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:44 crc kubenswrapper[4781]: E0227 00:08:44.309570 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:44 crc kubenswrapper[4781]: E0227 00:08:44.309888 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:44 crc kubenswrapper[4781]: E0227 00:08:44.309754 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:46 crc kubenswrapper[4781]: I0227 00:08:46.308841 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:46 crc kubenswrapper[4781]: I0227 00:08:46.308974 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:46 crc kubenswrapper[4781]: I0227 00:08:46.309012 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:46 crc kubenswrapper[4781]: E0227 00:08:46.309009 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:46 crc kubenswrapper[4781]: I0227 00:08:46.309077 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:46 crc kubenswrapper[4781]: E0227 00:08:46.309238 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:46 crc kubenswrapper[4781]: E0227 00:08:46.309363 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:46 crc kubenswrapper[4781]: E0227 00:08:46.309457 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:46 crc kubenswrapper[4781]: E0227 00:08:46.440874 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 27 00:08:46 crc kubenswrapper[4781]: I0227 00:08:46.679235 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tlstj_9a6dd1e0-45ab-46f0-b298-d89e47aaeecb/kube-multus/1.log" Feb 27 00:08:46 crc kubenswrapper[4781]: I0227 00:08:46.679890 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tlstj_9a6dd1e0-45ab-46f0-b298-d89e47aaeecb/kube-multus/0.log" Feb 27 00:08:46 crc kubenswrapper[4781]: I0227 00:08:46.679950 4781 generic.go:334] "Generic (PLEG): container finished" podID="9a6dd1e0-45ab-46f0-b298-d89e47aaeecb" containerID="3baa523fad3e36a4728c991057de2da0f51fa6a92e36153c58a2fadc65bd7606" exitCode=1 Feb 27 00:08:46 crc kubenswrapper[4781]: I0227 00:08:46.679986 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tlstj" event={"ID":"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb","Type":"ContainerDied","Data":"3baa523fad3e36a4728c991057de2da0f51fa6a92e36153c58a2fadc65bd7606"} Feb 27 00:08:46 crc kubenswrapper[4781]: I0227 00:08:46.680030 4781 scope.go:117] "RemoveContainer" containerID="3fdad3919a314b39886e97d7bf8da93518856ac166aa080ebc70df5d01bf8608" Feb 27 00:08:46 crc kubenswrapper[4781]: I0227 00:08:46.680794 4781 scope.go:117] "RemoveContainer" containerID="3baa523fad3e36a4728c991057de2da0f51fa6a92e36153c58a2fadc65bd7606" Feb 27 00:08:46 crc kubenswrapper[4781]: E0227 00:08:46.681382 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-tlstj_openshift-multus(9a6dd1e0-45ab-46f0-b298-d89e47aaeecb)\"" pod="openshift-multus/multus-tlstj" podUID="9a6dd1e0-45ab-46f0-b298-d89e47aaeecb" Feb 27 00:08:47 crc kubenswrapper[4781]: I0227 00:08:47.692704 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tlstj_9a6dd1e0-45ab-46f0-b298-d89e47aaeecb/kube-multus/1.log" Feb 27 00:08:48 crc 
kubenswrapper[4781]: I0227 00:08:48.309195 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:48 crc kubenswrapper[4781]: I0227 00:08:48.309233 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:48 crc kubenswrapper[4781]: I0227 00:08:48.309401 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:48 crc kubenswrapper[4781]: E0227 00:08:48.309406 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 00:08:48 crc kubenswrapper[4781]: I0227 00:08:48.309428 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:48 crc kubenswrapper[4781]: E0227 00:08:48.309481 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:48 crc kubenswrapper[4781]: E0227 00:08:48.309563 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:48 crc kubenswrapper[4781]: E0227 00:08:48.309827 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:50 crc kubenswrapper[4781]: I0227 00:08:50.308607 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:08:50 crc kubenswrapper[4781]: I0227 00:08:50.308717 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:08:50 crc kubenswrapper[4781]: I0227 00:08:50.308741 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:08:50 crc kubenswrapper[4781]: E0227 00:08:50.308795 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22" Feb 27 00:08:50 crc kubenswrapper[4781]: I0227 00:08:50.308821 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:08:50 crc kubenswrapper[4781]: E0227 00:08:50.308932 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 00:08:50 crc kubenswrapper[4781]: E0227 00:08:50.309029 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 00:08:50 crc kubenswrapper[4781]: E0227 00:08:50.309148 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 27 00:08:51 crc kubenswrapper[4781]: E0227 00:08:51.441566 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 27 00:08:52 crc kubenswrapper[4781]: I0227 00:08:52.308532 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 27 00:08:52 crc kubenswrapper[4781]: I0227 00:08:52.308715 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 00:08:52 crc kubenswrapper[4781]: I0227 00:08:52.308755 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 27 00:08:52 crc kubenswrapper[4781]: I0227 00:08:52.308812 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj"
Feb 27 00:08:52 crc kubenswrapper[4781]: E0227 00:08:52.308717 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 27 00:08:52 crc kubenswrapper[4781]: E0227 00:08:52.308927 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 27 00:08:52 crc kubenswrapper[4781]: E0227 00:08:52.309101 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 27 00:08:52 crc kubenswrapper[4781]: E0227 00:08:52.309193 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22"
Feb 27 00:08:54 crc kubenswrapper[4781]: I0227 00:08:54.308720 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 00:08:54 crc kubenswrapper[4781]: I0227 00:08:54.308779 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 27 00:08:54 crc kubenswrapper[4781]: I0227 00:08:54.308841 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj"
Feb 27 00:08:54 crc kubenswrapper[4781]: E0227 00:08:54.308912 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 27 00:08:54 crc kubenswrapper[4781]: I0227 00:08:54.308739 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 27 00:08:54 crc kubenswrapper[4781]: E0227 00:08:54.309091 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 27 00:08:54 crc kubenswrapper[4781]: E0227 00:08:54.309203 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 27 00:08:54 crc kubenswrapper[4781]: E0227 00:08:54.309289 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22"
Feb 27 00:08:55 crc kubenswrapper[4781]: I0227 00:08:55.309858 4781 scope.go:117] "RemoveContainer" containerID="ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867"
Feb 27 00:08:55 crc kubenswrapper[4781]: I0227 00:08:55.728394 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovnkube-controller/3.log"
Feb 27 00:08:55 crc kubenswrapper[4781]: I0227 00:08:55.732577 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerStarted","Data":"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c"}
Feb 27 00:08:55 crc kubenswrapper[4781]: I0227 00:08:55.732939 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6"
Feb 27 00:08:56 crc kubenswrapper[4781]: I0227 00:08:56.309178 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 00:08:56 crc kubenswrapper[4781]: I0227 00:08:56.309282 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 27 00:08:56 crc kubenswrapper[4781]: I0227 00:08:56.309371 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj"
Feb 27 00:08:56 crc kubenswrapper[4781]: E0227 00:08:56.309500 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 27 00:08:56 crc kubenswrapper[4781]: I0227 00:08:56.309582 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 27 00:08:56 crc kubenswrapper[4781]: E0227 00:08:56.309716 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 27 00:08:56 crc kubenswrapper[4781]: E0227 00:08:56.309837 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 27 00:08:56 crc kubenswrapper[4781]: E0227 00:08:56.310023 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22"
Feb 27 00:08:56 crc kubenswrapper[4781]: I0227 00:08:56.384329 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" podStartSLOduration=150.384300853 podStartE2EDuration="2m30.384300853s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:08:55.783938069 +0000 UTC m=+205.041477633" watchObservedRunningTime="2026-02-27 00:08:56.384300853 +0000 UTC m=+205.641840457"
Feb 27 00:08:56 crc kubenswrapper[4781]: I0227 00:08:56.385710 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kpnjj"]
Feb 27 00:08:56 crc kubenswrapper[4781]: E0227 00:08:56.442828 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 27 00:08:56 crc kubenswrapper[4781]: I0227 00:08:56.736173 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj"
Feb 27 00:08:56 crc kubenswrapper[4781]: E0227 00:08:56.736314 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22"
Feb 27 00:08:58 crc kubenswrapper[4781]: I0227 00:08:58.309400 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 27 00:08:58 crc kubenswrapper[4781]: I0227 00:08:58.309500 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 00:08:58 crc kubenswrapper[4781]: I0227 00:08:58.309408 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 27 00:08:58 crc kubenswrapper[4781]: I0227 00:08:58.309400 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj"
Feb 27 00:08:58 crc kubenswrapper[4781]: E0227 00:08:58.309589 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 27 00:08:58 crc kubenswrapper[4781]: E0227 00:08:58.309905 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 27 00:08:58 crc kubenswrapper[4781]: E0227 00:08:58.309948 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 27 00:08:58 crc kubenswrapper[4781]: E0227 00:08:58.310076 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22"
Feb 27 00:08:59 crc kubenswrapper[4781]: I0227 00:08:59.309585 4781 scope.go:117] "RemoveContainer" containerID="3baa523fad3e36a4728c991057de2da0f51fa6a92e36153c58a2fadc65bd7606"
Feb 27 00:08:59 crc kubenswrapper[4781]: I0227 00:08:59.755140 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tlstj_9a6dd1e0-45ab-46f0-b298-d89e47aaeecb/kube-multus/1.log"
Feb 27 00:08:59 crc kubenswrapper[4781]: I0227 00:08:59.755602 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tlstj" event={"ID":"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb","Type":"ContainerStarted","Data":"a286864c68415e96f38bba630ac2325989837881e34a926c93977715f330a129"}
Feb 27 00:09:00 crc kubenswrapper[4781]: I0227 00:09:00.309189 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj"
Feb 27 00:09:00 crc kubenswrapper[4781]: I0227 00:09:00.309263 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 27 00:09:00 crc kubenswrapper[4781]: I0227 00:09:00.309277 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 27 00:09:00 crc kubenswrapper[4781]: I0227 00:09:00.309220 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 00:09:00 crc kubenswrapper[4781]: E0227 00:09:00.309430 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpnjj" podUID="e866e388-01ab-407a-a59b-d0ba6c3f6f22"
Feb 27 00:09:00 crc kubenswrapper[4781]: E0227 00:09:00.309573 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 27 00:09:00 crc kubenswrapper[4781]: E0227 00:09:00.309695 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 27 00:09:00 crc kubenswrapper[4781]: E0227 00:09:00.309874 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.879205 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.931915 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-cr2bb"]
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.933098 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-cr2bb"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.936947 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt"]
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.937818 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.942754 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-29z97"]
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.943546 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-29z97"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.945065 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ktjdc"]
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.945564 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.945777 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.946022 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.948254 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.948540 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.948587 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.948885 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.948957 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.949188 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.949402 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.948905 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.949706 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.950450 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.950593 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.950761 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.950886 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.951078 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.951255 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.951989 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.952148 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.955803 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.956008 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.956175 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.956513 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.956554 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.956727 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.956932 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.957307 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.958247 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.958477 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.958723 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.962595 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.968588 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-vtsxv"]
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.969336 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-vtsxv"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.969923 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls"]
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.970675 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.973698 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.974273 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-f4jxd"]
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.975014 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.982738 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.983107 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.983261 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.983411 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.983556 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.983713 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.983864 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.984841 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.990707 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.991147 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.991402 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.991553 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.991736 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.991905 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.992023 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.992145 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.992263 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.992432 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.992549 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.992692 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.992809 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.993427 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7"]
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.993903 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7"
Feb 27 00:09:01 crc kubenswrapper[4781]: I0227 00:09:01.997163 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.008664 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.015521 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.018996 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.071012 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-qjwrj"]
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.071376 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fdkct"]
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.071675 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2zw27"]
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.071938 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-2zw27"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.072198 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-qjwrj"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.072348 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078103 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ba2e306-8f79-4e15-8529-f3a16a0fa95f-serving-cert\") pod \"etcd-operator-b45778765-f4jxd\" (UID: \"3ba2e306-8f79-4e15-8529-f3a16a0fa95f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078139 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrg6p\" (UniqueName: \"kubernetes.io/projected/a24423db-53f2-4555-81e4-228b3911e144-kube-api-access-xrg6p\") pod \"route-controller-manager-6576b87f9c-swgz7\" (UID: \"a24423db-53f2-4555-81e4-228b3911e144\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078157 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb172836-9833-43d5-a99b-cc01b3dd6694-config\") pod \"machine-approver-56656f9798-rw9ls\" (UID: \"cb172836-9833-43d5-a99b-cc01b3dd6694\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078174 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnbwb\" (UniqueName: \"kubernetes.io/projected/c7332c18-9748-49d2-b512-a46c2d1fcb79-kube-api-access-tnbwb\") pod \"controller-manager-879f6c89f-ktjdc\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078190 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/76705148-274c-4428-9508-13fe1193646e-console-oauth-config\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078203 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3ba2e306-8f79-4e15-8529-f3a16a0fa95f-etcd-client\") pod \"etcd-operator-b45778765-f4jxd\" (UID: \"3ba2e306-8f79-4e15-8529-f3a16a0fa95f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078217 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77s24\" (UniqueName: \"kubernetes.io/projected/3ba2e306-8f79-4e15-8529-f3a16a0fa95f-kube-api-access-77s24\") pod \"etcd-operator-b45778765-f4jxd\" (UID: \"3ba2e306-8f79-4e15-8529-f3a16a0fa95f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078233 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7332c18-9748-49d2-b512-a46c2d1fcb79-serving-cert\") pod \"controller-manager-879f6c89f-ktjdc\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078258 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d9ce11ed-3022-47e0-8150-8af94af65076-etcd-serving-ca\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078272 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9ce11ed-3022-47e0-8150-8af94af65076-serving-cert\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078287 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/cb172836-9833-43d5-a99b-cc01b3dd6694-machine-approver-tls\") pod \"machine-approver-56656f9798-rw9ls\" (UID: \"cb172836-9833-43d5-a99b-cc01b3dd6694\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078304 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9ce11ed-3022-47e0-8150-8af94af65076-trusted-ca-bundle\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078320 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a24423db-53f2-4555-81e4-228b3911e144-client-ca\") pod \"route-controller-manager-6576b87f9c-swgz7\" (UID: \"a24423db-53f2-4555-81e4-228b3911e144\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078334 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b9dadb6a-e49e-4473-8338-3af567aacb4a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078349 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3ba2e306-8f79-4e15-8529-f3a16a0fa95f-etcd-ca\") pod \"etcd-operator-b45778765-f4jxd\" (UID: \"3ba2e306-8f79-4e15-8529-f3a16a0fa95f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078362 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c7332c18-9748-49d2-b512-a46c2d1fcb79-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ktjdc\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078377 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a24423db-53f2-4555-81e4-228b3911e144-serving-cert\") pod \"route-controller-manager-6576b87f9c-swgz7\" (UID: \"a24423db-53f2-4555-81e4-228b3911e144\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078394 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjdbq\" (UniqueName: \"kubernetes.io/projected/76705148-274c-4428-9508-13fe1193646e-kube-api-access-xjdbq\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv"
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:
00:09:02.078409 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d9ce11ed-3022-47e0-8150-8af94af65076-etcd-client\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078425 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d9ce11ed-3022-47e0-8150-8af94af65076-image-import-ca\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078454 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3ba2e306-8f79-4e15-8529-f3a16a0fa95f-etcd-service-ca\") pod \"etcd-operator-b45778765-f4jxd\" (UID: \"3ba2e306-8f79-4e15-8529-f3a16a0fa95f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078475 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9dadb6a-e49e-4473-8338-3af567aacb4a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078497 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-oauth-serving-cert\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " 
pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078526 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/76705148-274c-4428-9508-13fe1193646e-console-serving-cert\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078548 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-service-ca\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078564 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d9ce11ed-3022-47e0-8150-8af94af65076-audit\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078578 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d9ce11ed-3022-47e0-8150-8af94af65076-audit-dir\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078592 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f8zb\" (UniqueName: \"kubernetes.io/projected/b9dadb6a-e49e-4473-8338-3af567aacb4a-kube-api-access-9f8zb\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: 
\"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078610 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-console-config\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078647 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d9ce11ed-3022-47e0-8150-8af94af65076-encryption-config\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078666 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d9ce11ed-3022-47e0-8150-8af94af65076-node-pullsecrets\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078685 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ba2e306-8f79-4e15-8529-f3a16a0fa95f-config\") pod \"etcd-operator-b45778765-f4jxd\" (UID: \"3ba2e306-8f79-4e15-8529-f3a16a0fa95f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078706 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d9ce11ed-3022-47e0-8150-8af94af65076-config\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078721 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tnl79"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.079182 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.079394 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhsds"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.079493 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.079694 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhsds" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.079724 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.080056 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.078726 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grq2j\" (UniqueName: \"kubernetes.io/projected/d9ce11ed-3022-47e0-8150-8af94af65076-kube-api-access-grq2j\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.080238 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b9dadb6a-e49e-4473-8338-3af567aacb4a-audit-policies\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.080261 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf6dv\" (UniqueName: \"kubernetes.io/projected/cb172836-9833-43d5-a99b-cc01b3dd6694-kube-api-access-tf6dv\") pod \"machine-approver-56656f9798-rw9ls\" (UID: \"cb172836-9833-43d5-a99b-cc01b3dd6694\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.080285 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b9dadb6a-e49e-4473-8338-3af567aacb4a-serving-cert\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.080299 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.080354 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tnl79" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.080577 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.080303 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b9dadb6a-e49e-4473-8338-3af567aacb4a-encryption-config\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.080910 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.080919 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-trusted-ca-bundle\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.080964 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cb172836-9833-43d5-a99b-cc01b3dd6694-auth-proxy-config\") pod \"machine-approver-56656f9798-rw9ls\" (UID: \"cb172836-9833-43d5-a99b-cc01b3dd6694\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.081018 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7332c18-9748-49d2-b512-a46c2d1fcb79-config\") pod \"controller-manager-879f6c89f-ktjdc\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.081042 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7332c18-9748-49d2-b512-a46c2d1fcb79-client-ca\") pod \"controller-manager-879f6c89f-ktjdc\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.081070 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a24423db-53f2-4555-81e4-228b3911e144-config\") pod \"route-controller-manager-6576b87f9c-swgz7\" (UID: \"a24423db-53f2-4555-81e4-228b3911e144\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.081130 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b9dadb6a-e49e-4473-8338-3af567aacb4a-etcd-client\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") 
" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.081166 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b9dadb6a-e49e-4473-8338-3af567aacb4a-audit-dir\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.082229 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-d9gmh"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.082939 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-9z8qr"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.082962 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-d9gmh" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.083268 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.088511 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sp4hz"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.088969 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sp4hz" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.090901 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2zhrk"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.091531 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.096663 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-72rjz"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.097193 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-72rjz" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.099776 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tw95c"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.100220 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-8lcg4"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.100407 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.100826 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-8lcg4" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.100945 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.101218 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.101248 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.101259 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.101578 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.101664 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.104965 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.105218 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.105384 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.105737 4781 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.108939 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.109241 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29535840-t9tlz"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.109784 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29535840-t9tlz" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.109919 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.110218 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.112589 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-2xvkz"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.112747 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.113248 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2xvkz" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.114969 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.115197 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.115365 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.117728 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.121202 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.122044 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.122273 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.122506 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.122816 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.123017 4781 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.123169 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.123305 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.123426 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.123584 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.123742 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.123848 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.124131 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.124253 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.124364 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.124464 4781 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.128200 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.128352 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.128991 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.129080 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.129317 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.129500 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.130197 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.135276 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.148175 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.149456 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6w28d"] Feb 27 00:09:02 crc 
kubenswrapper[4781]: I0227 00:09:02.150224 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.154912 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8d9mv"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.157843 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.158942 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmmt9"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.163382 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.164289 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wgpv7"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.164351 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6w28d" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.164643 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8d9mv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.164656 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.164902 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.165160 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmmt9" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.165764 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g2dgp"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.166314 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-cfts2"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.166486 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g2dgp" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.166867 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cfts2" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.167188 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.167647 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.167789 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6rw4v"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.168473 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-6rw4v" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.168675 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.169397 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.170816 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.171210 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.172204 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.172591 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.175358 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8mth6"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.175952 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8mth6" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.176265 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kxcrw"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.177057 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-kxcrw" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.177546 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.178510 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.179416 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.179896 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535848-ccctv"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.180492 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535848-ccctv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.182427 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ba2e306-8f79-4e15-8529-f3a16a0fa95f-config\") pod \"etcd-operator-b45778765-f4jxd\" (UID: \"3ba2e306-8f79-4e15-8529-f3a16a0fa95f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.182457 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d9ce11ed-3022-47e0-8150-8af94af65076-node-pullsecrets\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.182481 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14579b3e-131e-4e98-b060-a93d2581479c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vhsds\" (UID: \"14579b3e-131e-4e98-b060-a93d2581479c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhsds" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.182501 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz5ss\" (UniqueName: \"kubernetes.io/projected/14579b3e-131e-4e98-b060-a93d2581479c-kube-api-access-cz5ss\") pod \"openshift-controller-manager-operator-756b6f6bc6-vhsds\" (UID: \"14579b3e-131e-4e98-b060-a93d2581479c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhsds" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.182522 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/d9ce11ed-3022-47e0-8150-8af94af65076-config\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.182537 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grq2j\" (UniqueName: \"kubernetes.io/projected/d9ce11ed-3022-47e0-8150-8af94af65076-kube-api-access-grq2j\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.182569 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b9dadb6a-e49e-4473-8338-3af567aacb4a-audit-policies\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.182585 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkqct\" (UniqueName: \"kubernetes.io/projected/44e0d81c-a6e7-4e95-9901-ea32b8476755-kube-api-access-dkqct\") pod \"service-ca-9c57cc56f-kxcrw\" (UID: \"44e0d81c-a6e7-4e95-9901-ea32b8476755\") " pod="openshift-service-ca/service-ca-9c57cc56f-kxcrw" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.182604 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/010c6a41-8e2d-4391-ac1b-82814dad98a4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-8d9mv\" (UID: \"010c6a41-8e2d-4391-ac1b-82814dad98a4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8d9mv" Feb 27 00:09:02 crc 
kubenswrapper[4781]: I0227 00:09:02.182659 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/878b625f-d8df-457f-b208-f4bf5807a8d8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lz752\" (UID: \"878b625f-d8df-457f-b208-f4bf5807a8d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.182678 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tf6dv\" (UniqueName: \"kubernetes.io/projected/cb172836-9833-43d5-a99b-cc01b3dd6694-kube-api-access-tf6dv\") pod \"machine-approver-56656f9798-rw9ls\" (UID: \"cb172836-9833-43d5-a99b-cc01b3dd6694\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.183987 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9dadb6a-e49e-4473-8338-3af567aacb4a-serving-cert\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184013 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzxgj\" (UniqueName: \"kubernetes.io/projected/77c54f3f-bdb8-42ff-a466-3bfb1e2d9464-kube-api-access-pzxgj\") pod \"machine-api-operator-5694c8668f-29z97\" (UID: \"77c54f3f-bdb8-42ff-a466-3bfb1e2d9464\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-29z97" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.183544 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b9dadb6a-e49e-4473-8338-3af567aacb4a-audit-policies\") pod 
\"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.183819 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ksvtc"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184030 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/98d3eede-8852-4bf5-a905-25974e47445f-proxy-tls\") pod \"machine-config-operator-74547568cd-rhhqx\" (UID: \"98d3eede-8852-4bf5-a905-25974e47445f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184114 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58vdh\" (UniqueName: \"kubernetes.io/projected/878b625f-d8df-457f-b208-f4bf5807a8d8-kube-api-access-58vdh\") pod \"cluster-image-registry-operator-dc59b4c8b-lz752\" (UID: \"878b625f-d8df-457f-b208-f4bf5807a8d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184156 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b9dadb6a-e49e-4473-8338-3af567aacb4a-encryption-config\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184179 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-trusted-ca-bundle\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " 
pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184202 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cb172836-9833-43d5-a99b-cc01b3dd6694-auth-proxy-config\") pod \"machine-approver-56656f9798-rw9ls\" (UID: \"cb172836-9833-43d5-a99b-cc01b3dd6694\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184226 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ae09caff-6233-41f8-bb7d-a2314363e2fa-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mmx87\" (UID: \"ae09caff-6233-41f8-bb7d-a2314363e2fa\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184251 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9gbc\" (UniqueName: \"kubernetes.io/projected/ac4a870d-8cda-423b-a15b-391830c944f4-kube-api-access-k9gbc\") pod \"ingress-operator-5b745b69d9-cjjxc\" (UID: \"ac4a870d-8cda-423b-a15b-391830c944f4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184305 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7332c18-9748-49d2-b512-a46c2d1fcb79-config\") pod \"controller-manager-879f6c89f-ktjdc\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184326 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/c7332c18-9748-49d2-b512-a46c2d1fcb79-client-ca\") pod \"controller-manager-879f6c89f-ktjdc\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184353 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a24423db-53f2-4555-81e4-228b3911e144-config\") pod \"route-controller-manager-6576b87f9c-swgz7\" (UID: \"a24423db-53f2-4555-81e4-228b3911e144\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184374 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b9dadb6a-e49e-4473-8338-3af567aacb4a-etcd-client\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184396 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6w28d\" (UID: \"8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6w28d" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184418 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55d8ebfe-a683-40f4-a3ef-bbeadb78ced7-serving-cert\") pod \"console-operator-58897d9998-2zw27\" (UID: \"55d8ebfe-a683-40f4-a3ef-bbeadb78ced7\") " pod="openshift-console-operator/console-operator-58897d9998-2zw27" Feb 27 00:09:02 crc 
kubenswrapper[4781]: I0227 00:09:02.184440 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/878b625f-d8df-457f-b208-f4bf5807a8d8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lz752\" (UID: \"878b625f-d8df-457f-b208-f4bf5807a8d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184475 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b9dadb6a-e49e-4473-8338-3af567aacb4a-audit-dir\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184498 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac4a870d-8cda-423b-a15b-391830c944f4-metrics-tls\") pod \"ingress-operator-5b745b69d9-cjjxc\" (UID: \"ac4a870d-8cda-423b-a15b-391830c944f4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184526 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq5bh\" (UniqueName: \"kubernetes.io/projected/98d3eede-8852-4bf5-a905-25974e47445f-kube-api-access-pq5bh\") pod \"machine-config-operator-74547568cd-rhhqx\" (UID: \"98d3eede-8852-4bf5-a905-25974e47445f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184550 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnwlg\" (UniqueName: 
\"kubernetes.io/projected/6497cf4e-c461-4db9-88e4-5de2a5f28404-kube-api-access-qnwlg\") pod \"packageserver-d55dfcdfc-sw7s5\" (UID: \"6497cf4e-c461-4db9-88e4-5de2a5f28404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184576 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ba2e306-8f79-4e15-8529-f3a16a0fa95f-serving-cert\") pod \"etcd-operator-b45778765-f4jxd\" (UID: \"3ba2e306-8f79-4e15-8529-f3a16a0fa95f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184599 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrg6p\" (UniqueName: \"kubernetes.io/projected/a24423db-53f2-4555-81e4-228b3911e144-kube-api-access-xrg6p\") pod \"route-controller-manager-6576b87f9c-swgz7\" (UID: \"a24423db-53f2-4555-81e4-228b3911e144\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184619 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6w28d\" (UID: \"8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6w28d" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184666 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb172836-9833-43d5-a99b-cc01b3dd6694-config\") pod \"machine-approver-56656f9798-rw9ls\" (UID: \"cb172836-9833-43d5-a99b-cc01b3dd6694\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls" Feb 27 00:09:02 crc 
kubenswrapper[4781]: I0227 00:09:02.184692 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnbwb\" (UniqueName: \"kubernetes.io/projected/c7332c18-9748-49d2-b512-a46c2d1fcb79-kube-api-access-tnbwb\") pod \"controller-manager-879f6c89f-ktjdc\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184717 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw4cs\" (UniqueName: \"kubernetes.io/projected/6dc17f1d-c1f4-43b9-9291-7c32c6804d44-kube-api-access-mw4cs\") pod \"marketplace-operator-79b997595-wgpv7\" (UID: \"6dc17f1d-c1f4-43b9-9291-7c32c6804d44\") " pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184739 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/878b625f-d8df-457f-b208-f4bf5807a8d8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lz752\" (UID: \"878b625f-d8df-457f-b208-f4bf5807a8d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184761 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/76705148-274c-4428-9508-13fe1193646e-console-oauth-config\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184783 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2sq5\" (UniqueName: 
\"kubernetes.io/projected/13b9671c-f825-49de-913c-42e8d161f7f8-kube-api-access-r2sq5\") pod \"kube-storage-version-migrator-operator-b67b599dd-g2dgp\" (UID: \"13b9671c-f825-49de-913c-42e8d161f7f8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g2dgp" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184812 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3ba2e306-8f79-4e15-8529-f3a16a0fa95f-etcd-client\") pod \"etcd-operator-b45778765-f4jxd\" (UID: \"3ba2e306-8f79-4e15-8529-f3a16a0fa95f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184833 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77s24\" (UniqueName: \"kubernetes.io/projected/3ba2e306-8f79-4e15-8529-f3a16a0fa95f-kube-api-access-77s24\") pod \"etcd-operator-b45778765-f4jxd\" (UID: \"3ba2e306-8f79-4e15-8529-f3a16a0fa95f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184856 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7332c18-9748-49d2-b512-a46c2d1fcb79-serving-cert\") pod \"controller-manager-879f6c89f-ktjdc\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184878 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6dc17f1d-c1f4-43b9-9291-7c32c6804d44-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wgpv7\" (UID: \"6dc17f1d-c1f4-43b9-9291-7c32c6804d44\") " pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" 
Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184898 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/98d3eede-8852-4bf5-a905-25974e47445f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-rhhqx\" (UID: \"98d3eede-8852-4bf5-a905-25974e47445f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184918 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6497cf4e-c461-4db9-88e4-5de2a5f28404-tmpfs\") pod \"packageserver-d55dfcdfc-sw7s5\" (UID: \"6497cf4e-c461-4db9-88e4-5de2a5f28404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184941 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pgmz\" (UniqueName: \"kubernetes.io/projected/010c6a41-8e2d-4391-ac1b-82814dad98a4-kube-api-access-9pgmz\") pod \"control-plane-machine-set-operator-78cbb6b69f-8d9mv\" (UID: \"010c6a41-8e2d-4391-ac1b-82814dad98a4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8d9mv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.184971 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d9ce11ed-3022-47e0-8150-8af94af65076-etcd-serving-ca\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185000 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/44e0d81c-a6e7-4e95-9901-ea32b8476755-signing-key\") pod \"service-ca-9c57cc56f-kxcrw\" (UID: \"44e0d81c-a6e7-4e95-9901-ea32b8476755\") " pod="openshift-service-ca/service-ca-9c57cc56f-kxcrw" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185048 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djzfc\" (UniqueName: \"kubernetes.io/projected/3f3571fd-ce1b-4105-9100-020fd1cd5076-kube-api-access-djzfc\") pod \"cluster-samples-operator-665b6dd947-tnl79\" (UID: \"3f3571fd-ce1b-4105-9100-020fd1cd5076\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tnl79" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185073 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9ce11ed-3022-47e0-8150-8af94af65076-serving-cert\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185101 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/cb172836-9833-43d5-a99b-cc01b3dd6694-machine-approver-tls\") pod \"machine-approver-56656f9798-rw9ls\" (UID: \"cb172836-9833-43d5-a99b-cc01b3dd6694\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.182758 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d9ce11ed-3022-47e0-8150-8af94af65076-node-pullsecrets\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185124 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9ce11ed-3022-47e0-8150-8af94af65076-trusted-ca-bundle\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185189 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b9dadb6a-e49e-4473-8338-3af567aacb4a-audit-dir\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185225 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a24423db-53f2-4555-81e4-228b3911e144-client-ca\") pod \"route-controller-manager-6576b87f9c-swgz7\" (UID: \"a24423db-53f2-4555-81e4-228b3911e144\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185265 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/98d3eede-8852-4bf5-a905-25974e47445f-images\") pod \"machine-config-operator-74547568cd-rhhqx\" (UID: \"98d3eede-8852-4bf5-a905-25974e47445f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185298 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7llv\" (UniqueName: \"kubernetes.io/projected/ae09caff-6233-41f8-bb7d-a2314363e2fa-kube-api-access-m7llv\") pod \"olm-operator-6b444d44fb-mmx87\" (UID: \"ae09caff-6233-41f8-bb7d-a2314363e2fa\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185330 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b9dadb6a-e49e-4473-8338-3af567aacb4a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185348 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6497cf4e-c461-4db9-88e4-5de2a5f28404-webhook-cert\") pod \"packageserver-d55dfcdfc-sw7s5\" (UID: \"6497cf4e-c461-4db9-88e4-5de2a5f28404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185364 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac4a870d-8cda-423b-a15b-391830c944f4-trusted-ca\") pod \"ingress-operator-5b745b69d9-cjjxc\" (UID: \"ac4a870d-8cda-423b-a15b-391830c944f4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185382 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3ba2e306-8f79-4e15-8529-f3a16a0fa95f-etcd-ca\") pod \"etcd-operator-b45778765-f4jxd\" (UID: \"3ba2e306-8f79-4e15-8529-f3a16a0fa95f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185401 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c7332c18-9748-49d2-b512-a46c2d1fcb79-proxy-ca-bundles\") pod 
\"controller-manager-879f6c89f-ktjdc\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185421 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a24423db-53f2-4555-81e4-228b3911e144-serving-cert\") pod \"route-controller-manager-6576b87f9c-swgz7\" (UID: \"a24423db-53f2-4555-81e4-228b3911e144\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185439 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjdbq\" (UniqueName: \"kubernetes.io/projected/76705148-274c-4428-9508-13fe1193646e-kube-api-access-xjdbq\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185456 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ac4a870d-8cda-423b-a15b-391830c944f4-bound-sa-token\") pod \"ingress-operator-5b745b69d9-cjjxc\" (UID: \"ac4a870d-8cda-423b-a15b-391830c944f4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185475 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3f3571fd-ce1b-4105-9100-020fd1cd5076-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tnl79\" (UID: \"3f3571fd-ce1b-4105-9100-020fd1cd5076\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tnl79" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185500 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/77c54f3f-bdb8-42ff-a466-3bfb1e2d9464-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-29z97\" (UID: \"77c54f3f-bdb8-42ff-a466-3bfb1e2d9464\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-29z97" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185520 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6dc17f1d-c1f4-43b9-9291-7c32c6804d44-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wgpv7\" (UID: \"6dc17f1d-c1f4-43b9-9291-7c32c6804d44\") " pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185542 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14579b3e-131e-4e98-b060-a93d2581479c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vhsds\" (UID: \"14579b3e-131e-4e98-b060-a93d2581479c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhsds" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185561 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ae09caff-6233-41f8-bb7d-a2314363e2fa-srv-cert\") pod \"olm-operator-6b444d44fb-mmx87\" (UID: \"ae09caff-6233-41f8-bb7d-a2314363e2fa\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185596 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3ba2e306-8f79-4e15-8529-f3a16a0fa95f-etcd-service-ca\") pod 
\"etcd-operator-b45778765-f4jxd\" (UID: \"3ba2e306-8f79-4e15-8529-f3a16a0fa95f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185614 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d9ce11ed-3022-47e0-8150-8af94af65076-etcd-client\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185647 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d9ce11ed-3022-47e0-8150-8af94af65076-image-import-ca\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185664 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9dadb6a-e49e-4473-8338-3af567aacb4a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185684 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-oauth-serving-cert\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185704 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/76705148-274c-4428-9508-13fe1193646e-console-serving-cert\") pod 
\"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185714 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9ce11ed-3022-47e0-8150-8af94af65076-config\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185970 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cb172836-9833-43d5-a99b-cc01b3dd6694-auth-proxy-config\") pod \"machine-approver-56656f9798-rw9ls\" (UID: \"cb172836-9833-43d5-a99b-cc01b3dd6694\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.186028 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-service-ca\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.186054 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77c54f3f-bdb8-42ff-a466-3bfb1e2d9464-config\") pod \"machine-api-operator-5694c8668f-29z97\" (UID: \"77c54f3f-bdb8-42ff-a466-3bfb1e2d9464\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-29z97" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.186074 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-6w28d\" (UID: \"8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6w28d" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.186085 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9ce11ed-3022-47e0-8150-8af94af65076-trusted-ca-bundle\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.186096 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d9ce11ed-3022-47e0-8150-8af94af65076-audit\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.186122 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d9ce11ed-3022-47e0-8150-8af94af65076-audit-dir\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.186147 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f8zb\" (UniqueName: \"kubernetes.io/projected/b9dadb6a-e49e-4473-8338-3af567aacb4a-kube-api-access-9f8zb\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.186164 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/13b9671c-f825-49de-913c-42e8d161f7f8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-g2dgp\" (UID: \"13b9671c-f825-49de-913c-42e8d161f7f8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g2dgp" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.186184 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-console-config\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.186199 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55d8ebfe-a683-40f4-a3ef-bbeadb78ced7-config\") pod \"console-operator-58897d9998-2zw27\" (UID: \"55d8ebfe-a683-40f4-a3ef-bbeadb78ced7\") " pod="openshift-console-operator/console-operator-58897d9998-2zw27" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.186214 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55d8ebfe-a683-40f4-a3ef-bbeadb78ced7-trusted-ca\") pod \"console-operator-58897d9998-2zw27\" (UID: \"55d8ebfe-a683-40f4-a3ef-bbeadb78ced7\") " pod="openshift-console-operator/console-operator-58897d9998-2zw27" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.186231 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66pts\" (UniqueName: \"kubernetes.io/projected/55d8ebfe-a683-40f4-a3ef-bbeadb78ced7-kube-api-access-66pts\") pod \"console-operator-58897d9998-2zw27\" (UID: \"55d8ebfe-a683-40f4-a3ef-bbeadb78ced7\") " pod="openshift-console-operator/console-operator-58897d9998-2zw27" Feb 27 00:09:02 crc 
kubenswrapper[4781]: I0227 00:09:02.186288 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb172836-9833-43d5-a99b-cc01b3dd6694-config\") pod \"machine-approver-56656f9798-rw9ls\" (UID: \"cb172836-9833-43d5-a99b-cc01b3dd6694\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.186308 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13b9671c-f825-49de-913c-42e8d161f7f8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-g2dgp\" (UID: \"13b9671c-f825-49de-913c-42e8d161f7f8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g2dgp" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.186327 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6497cf4e-c461-4db9-88e4-5de2a5f28404-apiservice-cert\") pod \"packageserver-d55dfcdfc-sw7s5\" (UID: \"6497cf4e-c461-4db9-88e4-5de2a5f28404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.186349 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d9ce11ed-3022-47e0-8150-8af94af65076-encryption-config\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.186367 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/77c54f3f-bdb8-42ff-a466-3bfb1e2d9464-images\") pod 
\"machine-api-operator-5694c8668f-29z97\" (UID: \"77c54f3f-bdb8-42ff-a466-3bfb1e2d9464\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-29z97" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.186381 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/44e0d81c-a6e7-4e95-9901-ea32b8476755-signing-cabundle\") pod \"service-ca-9c57cc56f-kxcrw\" (UID: \"44e0d81c-a6e7-4e95-9901-ea32b8476755\") " pod="openshift-service-ca/service-ca-9c57cc56f-kxcrw" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.186934 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-trusted-ca-bundle\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.187332 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3ba2e306-8f79-4e15-8529-f3a16a0fa95f-etcd-ca\") pod \"etcd-operator-b45778765-f4jxd\" (UID: \"3ba2e306-8f79-4e15-8529-f3a16a0fa95f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.187843 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3ba2e306-8f79-4e15-8529-f3a16a0fa95f-etcd-service-ca\") pod \"etcd-operator-b45778765-f4jxd\" (UID: \"3ba2e306-8f79-4e15-8529-f3a16a0fa95f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.185519 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ba2e306-8f79-4e15-8529-f3a16a0fa95f-config\") pod 
\"etcd-operator-b45778765-f4jxd\" (UID: \"3ba2e306-8f79-4e15-8529-f3a16a0fa95f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.187933 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ksvtc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.188050 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d9ce11ed-3022-47e0-8150-8af94af65076-audit-dir\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.188705 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d9ce11ed-3022-47e0-8150-8af94af65076-etcd-serving-ca\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.189272 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c7332c18-9748-49d2-b512-a46c2d1fcb79-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ktjdc\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.189458 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a24423db-53f2-4555-81e4-228b3911e144-client-ca\") pod \"route-controller-manager-6576b87f9c-swgz7\" (UID: \"a24423db-53f2-4555-81e4-228b3911e144\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" Feb 27 00:09:02 crc 
kubenswrapper[4781]: I0227 00:09:02.189841 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a24423db-53f2-4555-81e4-228b3911e144-config\") pod \"route-controller-manager-6576b87f9c-swgz7\" (UID: \"a24423db-53f2-4555-81e4-228b3911e144\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.190008 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-oauth-serving-cert\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.190150 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-service-ca\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.191899 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9dadb6a-e49e-4473-8338-3af567aacb4a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.192026 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b9dadb6a-e49e-4473-8338-3af567aacb4a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 
00:09:02.192116 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7332c18-9748-49d2-b512-a46c2d1fcb79-config\") pod \"controller-manager-879f6c89f-ktjdc\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.192223 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7332c18-9748-49d2-b512-a46c2d1fcb79-client-ca\") pod \"controller-manager-879f6c89f-ktjdc\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.192245 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-vtsxv"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.192485 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d9ce11ed-3022-47e0-8150-8af94af65076-audit\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.192500 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-console-config\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.194803 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-574r8"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.195395 4781 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.195494 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3ba2e306-8f79-4e15-8529-f3a16a0fa95f-etcd-client\") pod \"etcd-operator-b45778765-f4jxd\" (UID: \"3ba2e306-8f79-4e15-8529-f3a16a0fa95f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.195536 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/76705148-274c-4428-9508-13fe1193646e-console-serving-cert\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.195571 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-574r8" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.195970 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a24423db-53f2-4555-81e4-228b3911e144-serving-cert\") pod \"route-controller-manager-6576b87f9c-swgz7\" (UID: \"a24423db-53f2-4555-81e4-228b3911e144\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.196270 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9ce11ed-3022-47e0-8150-8af94af65076-serving-cert\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.197075 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/b9dadb6a-e49e-4473-8338-3af567aacb4a-etcd-client\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.197262 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/76705148-274c-4428-9508-13fe1193646e-console-oauth-config\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.197379 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ba2e306-8f79-4e15-8529-f3a16a0fa95f-serving-cert\") pod \"etcd-operator-b45778765-f4jxd\" (UID: \"3ba2e306-8f79-4e15-8529-f3a16a0fa95f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.197551 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.197570 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/cb172836-9833-43d5-a99b-cc01b3dd6694-machine-approver-tls\") pod \"machine-approver-56656f9798-rw9ls\" (UID: \"cb172836-9833-43d5-a99b-cc01b3dd6694\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.198231 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7332c18-9748-49d2-b512-a46c2d1fcb79-serving-cert\") pod \"controller-manager-879f6c89f-ktjdc\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.198778 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d9ce11ed-3022-47e0-8150-8af94af65076-encryption-config\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.200774 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d9ce11ed-3022-47e0-8150-8af94af65076-image-import-ca\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.200860 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.200894 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g2dgp"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.201939 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tnl79"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.203378 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-29z97"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.204757 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.205687 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9dadb6a-e49e-4473-8338-3af567aacb4a-serving-cert\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.206062 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d9ce11ed-3022-47e0-8150-8af94af65076-etcd-client\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.206876 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b9dadb6a-e49e-4473-8338-3af567aacb4a-encryption-config\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.216374 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-qjwrj"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.223344 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.226888 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29535840-t9tlz"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.239152 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-2xvkz"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.242096 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 
00:09:02.244934 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fdkct"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.246968 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2zhrk"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.249813 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ktjdc"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.251280 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.259948 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-f4jxd"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.260056 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.261574 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhsds"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.263927 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wgpv7"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.264997 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8d9mv"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.266097 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-9z8qr"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.267429 4781 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2zw27"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.268563 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535848-ccctv"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.269749 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ksvtc"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.271129 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sp4hz"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.272580 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.274766 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-72rjz"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.274843 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.276212 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-cfts2"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.276837 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmmt9"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.279240 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.280491 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6w28d"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.282549 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6rw4v"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.286983 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8mth6"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287396 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/77c54f3f-bdb8-42ff-a466-3bfb1e2d9464-images\") pod \"machine-api-operator-5694c8668f-29z97\" (UID: \"77c54f3f-bdb8-42ff-a466-3bfb1e2d9464\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-29z97" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287435 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/44e0d81c-a6e7-4e95-9901-ea32b8476755-signing-cabundle\") pod \"service-ca-9c57cc56f-kxcrw\" (UID: \"44e0d81c-a6e7-4e95-9901-ea32b8476755\") " pod="openshift-service-ca/service-ca-9c57cc56f-kxcrw" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287452 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14579b3e-131e-4e98-b060-a93d2581479c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vhsds\" (UID: \"14579b3e-131e-4e98-b060-a93d2581479c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhsds" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287470 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz5ss\" (UniqueName: 
\"kubernetes.io/projected/14579b3e-131e-4e98-b060-a93d2581479c-kube-api-access-cz5ss\") pod \"openshift-controller-manager-operator-756b6f6bc6-vhsds\" (UID: \"14579b3e-131e-4e98-b060-a93d2581479c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhsds" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287487 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkqct\" (UniqueName: \"kubernetes.io/projected/44e0d81c-a6e7-4e95-9901-ea32b8476755-kube-api-access-dkqct\") pod \"service-ca-9c57cc56f-kxcrw\" (UID: \"44e0d81c-a6e7-4e95-9901-ea32b8476755\") " pod="openshift-service-ca/service-ca-9c57cc56f-kxcrw" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287506 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/010c6a41-8e2d-4391-ac1b-82814dad98a4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-8d9mv\" (UID: \"010c6a41-8e2d-4391-ac1b-82814dad98a4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8d9mv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287530 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/878b625f-d8df-457f-b208-f4bf5807a8d8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lz752\" (UID: \"878b625f-d8df-457f-b208-f4bf5807a8d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287553 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/98d3eede-8852-4bf5-a905-25974e47445f-proxy-tls\") pod \"machine-config-operator-74547568cd-rhhqx\" (UID: \"98d3eede-8852-4bf5-a905-25974e47445f\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287570 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58vdh\" (UniqueName: \"kubernetes.io/projected/878b625f-d8df-457f-b208-f4bf5807a8d8-kube-api-access-58vdh\") pod \"cluster-image-registry-operator-dc59b4c8b-lz752\" (UID: \"878b625f-d8df-457f-b208-f4bf5807a8d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287592 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzxgj\" (UniqueName: \"kubernetes.io/projected/77c54f3f-bdb8-42ff-a466-3bfb1e2d9464-kube-api-access-pzxgj\") pod \"machine-api-operator-5694c8668f-29z97\" (UID: \"77c54f3f-bdb8-42ff-a466-3bfb1e2d9464\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-29z97" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287610 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ae09caff-6233-41f8-bb7d-a2314363e2fa-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mmx87\" (UID: \"ae09caff-6233-41f8-bb7d-a2314363e2fa\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287638 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9gbc\" (UniqueName: \"kubernetes.io/projected/ac4a870d-8cda-423b-a15b-391830c944f4-kube-api-access-k9gbc\") pod \"ingress-operator-5b745b69d9-cjjxc\" (UID: \"ac4a870d-8cda-423b-a15b-391830c944f4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287670 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/55d8ebfe-a683-40f4-a3ef-bbeadb78ced7-serving-cert\") pod \"console-operator-58897d9998-2zw27\" (UID: \"55d8ebfe-a683-40f4-a3ef-bbeadb78ced7\") " pod="openshift-console-operator/console-operator-58897d9998-2zw27" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287684 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/878b625f-d8df-457f-b208-f4bf5807a8d8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lz752\" (UID: \"878b625f-d8df-457f-b208-f4bf5807a8d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287704 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6w28d\" (UID: \"8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6w28d" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287729 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac4a870d-8cda-423b-a15b-391830c944f4-metrics-tls\") pod \"ingress-operator-5b745b69d9-cjjxc\" (UID: \"ac4a870d-8cda-423b-a15b-391830c944f4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287746 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq5bh\" (UniqueName: \"kubernetes.io/projected/98d3eede-8852-4bf5-a905-25974e47445f-kube-api-access-pq5bh\") pod \"machine-config-operator-74547568cd-rhhqx\" (UID: \"98d3eede-8852-4bf5-a905-25974e47445f\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287764 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnwlg\" (UniqueName: \"kubernetes.io/projected/6497cf4e-c461-4db9-88e4-5de2a5f28404-kube-api-access-qnwlg\") pod \"packageserver-d55dfcdfc-sw7s5\" (UID: \"6497cf4e-c461-4db9-88e4-5de2a5f28404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287784 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6w28d\" (UID: \"8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6w28d" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287802 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw4cs\" (UniqueName: \"kubernetes.io/projected/6dc17f1d-c1f4-43b9-9291-7c32c6804d44-kube-api-access-mw4cs\") pod \"marketplace-operator-79b997595-wgpv7\" (UID: \"6dc17f1d-c1f4-43b9-9291-7c32c6804d44\") " pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287817 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/878b625f-d8df-457f-b208-f4bf5807a8d8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lz752\" (UID: \"878b625f-d8df-457f-b208-f4bf5807a8d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287842 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2sq5\" 
(UniqueName: \"kubernetes.io/projected/13b9671c-f825-49de-913c-42e8d161f7f8-kube-api-access-r2sq5\") pod \"kube-storage-version-migrator-operator-b67b599dd-g2dgp\" (UID: \"13b9671c-f825-49de-913c-42e8d161f7f8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g2dgp" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287870 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6dc17f1d-c1f4-43b9-9291-7c32c6804d44-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wgpv7\" (UID: \"6dc17f1d-c1f4-43b9-9291-7c32c6804d44\") " pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287891 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/98d3eede-8852-4bf5-a905-25974e47445f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-rhhqx\" (UID: \"98d3eede-8852-4bf5-a905-25974e47445f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287915 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6497cf4e-c461-4db9-88e4-5de2a5f28404-tmpfs\") pod \"packageserver-d55dfcdfc-sw7s5\" (UID: \"6497cf4e-c461-4db9-88e4-5de2a5f28404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287936 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pgmz\" (UniqueName: \"kubernetes.io/projected/010c6a41-8e2d-4391-ac1b-82814dad98a4-kube-api-access-9pgmz\") pod \"control-plane-machine-set-operator-78cbb6b69f-8d9mv\" (UID: \"010c6a41-8e2d-4391-ac1b-82814dad98a4\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8d9mv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287962 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/44e0d81c-a6e7-4e95-9901-ea32b8476755-signing-key\") pod \"service-ca-9c57cc56f-kxcrw\" (UID: \"44e0d81c-a6e7-4e95-9901-ea32b8476755\") " pod="openshift-service-ca/service-ca-9c57cc56f-kxcrw" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287978 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djzfc\" (UniqueName: \"kubernetes.io/projected/3f3571fd-ce1b-4105-9100-020fd1cd5076-kube-api-access-djzfc\") pod \"cluster-samples-operator-665b6dd947-tnl79\" (UID: \"3f3571fd-ce1b-4105-9100-020fd1cd5076\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tnl79" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.287997 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/98d3eede-8852-4bf5-a905-25974e47445f-images\") pod \"machine-config-operator-74547568cd-rhhqx\" (UID: \"98d3eede-8852-4bf5-a905-25974e47445f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288013 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7llv\" (UniqueName: \"kubernetes.io/projected/ae09caff-6233-41f8-bb7d-a2314363e2fa-kube-api-access-m7llv\") pod \"olm-operator-6b444d44fb-mmx87\" (UID: \"ae09caff-6233-41f8-bb7d-a2314363e2fa\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288030 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/ac4a870d-8cda-423b-a15b-391830c944f4-trusted-ca\") pod \"ingress-operator-5b745b69d9-cjjxc\" (UID: \"ac4a870d-8cda-423b-a15b-391830c944f4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288047 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6497cf4e-c461-4db9-88e4-5de2a5f28404-webhook-cert\") pod \"packageserver-d55dfcdfc-sw7s5\" (UID: \"6497cf4e-c461-4db9-88e4-5de2a5f28404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288064 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3f3571fd-ce1b-4105-9100-020fd1cd5076-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tnl79\" (UID: \"3f3571fd-ce1b-4105-9100-020fd1cd5076\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tnl79" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288088 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ac4a870d-8cda-423b-a15b-391830c944f4-bound-sa-token\") pod \"ingress-operator-5b745b69d9-cjjxc\" (UID: \"ac4a870d-8cda-423b-a15b-391830c944f4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288112 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/77c54f3f-bdb8-42ff-a466-3bfb1e2d9464-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-29z97\" (UID: \"77c54f3f-bdb8-42ff-a466-3bfb1e2d9464\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-29z97" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 
00:09:02.288134 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6dc17f1d-c1f4-43b9-9291-7c32c6804d44-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wgpv7\" (UID: \"6dc17f1d-c1f4-43b9-9291-7c32c6804d44\") " pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288151 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14579b3e-131e-4e98-b060-a93d2581479c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vhsds\" (UID: \"14579b3e-131e-4e98-b060-a93d2581479c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhsds" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288167 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ae09caff-6233-41f8-bb7d-a2314363e2fa-srv-cert\") pod \"olm-operator-6b444d44fb-mmx87\" (UID: \"ae09caff-6233-41f8-bb7d-a2314363e2fa\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288191 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77c54f3f-bdb8-42ff-a466-3bfb1e2d9464-config\") pod \"machine-api-operator-5694c8668f-29z97\" (UID: \"77c54f3f-bdb8-42ff-a466-3bfb1e2d9464\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-29z97" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288207 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6w28d\" (UID: 
\"8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6w28d" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288228 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13b9671c-f825-49de-913c-42e8d161f7f8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-g2dgp\" (UID: \"13b9671c-f825-49de-913c-42e8d161f7f8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g2dgp" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288243 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55d8ebfe-a683-40f4-a3ef-bbeadb78ced7-trusted-ca\") pod \"console-operator-58897d9998-2zw27\" (UID: \"55d8ebfe-a683-40f4-a3ef-bbeadb78ced7\") " pod="openshift-console-operator/console-operator-58897d9998-2zw27" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288259 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66pts\" (UniqueName: \"kubernetes.io/projected/55d8ebfe-a683-40f4-a3ef-bbeadb78ced7-kube-api-access-66pts\") pod \"console-operator-58897d9998-2zw27\" (UID: \"55d8ebfe-a683-40f4-a3ef-bbeadb78ced7\") " pod="openshift-console-operator/console-operator-58897d9998-2zw27" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288276 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55d8ebfe-a683-40f4-a3ef-bbeadb78ced7-config\") pod \"console-operator-58897d9998-2zw27\" (UID: \"55d8ebfe-a683-40f4-a3ef-bbeadb78ced7\") " pod="openshift-console-operator/console-operator-58897d9998-2zw27" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288291 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/13b9671c-f825-49de-913c-42e8d161f7f8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-g2dgp\" (UID: \"13b9671c-f825-49de-913c-42e8d161f7f8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g2dgp" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288307 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6497cf4e-c461-4db9-88e4-5de2a5f28404-apiservice-cert\") pod \"packageserver-d55dfcdfc-sw7s5\" (UID: \"6497cf4e-c461-4db9-88e4-5de2a5f28404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288598 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14579b3e-131e-4e98-b060-a93d2581479c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vhsds\" (UID: \"14579b3e-131e-4e98-b060-a93d2581479c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhsds" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288622 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/77c54f3f-bdb8-42ff-a466-3bfb1e2d9464-images\") pod \"machine-api-operator-5694c8668f-29z97\" (UID: \"77c54f3f-bdb8-42ff-a466-3bfb1e2d9464\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-29z97" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288792 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/878b625f-d8df-457f-b208-f4bf5807a8d8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lz752\" (UID: \"878b625f-d8df-457f-b208-f4bf5807a8d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752" Feb 
27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.288973 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tw95c"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.289505 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6497cf4e-c461-4db9-88e4-5de2a5f28404-tmpfs\") pod \"packageserver-d55dfcdfc-sw7s5\" (UID: \"6497cf4e-c461-4db9-88e4-5de2a5f28404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.289515 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/98d3eede-8852-4bf5-a905-25974e47445f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-rhhqx\" (UID: \"98d3eede-8852-4bf5-a905-25974e47445f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.289793 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-574r8"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.291055 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55d8ebfe-a683-40f4-a3ef-bbeadb78ced7-config\") pod \"console-operator-58897d9998-2zw27\" (UID: \"55d8ebfe-a683-40f4-a3ef-bbeadb78ced7\") " pod="openshift-console-operator/console-operator-58897d9998-2zw27" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.291451 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55d8ebfe-a683-40f4-a3ef-bbeadb78ced7-serving-cert\") pod \"console-operator-58897d9998-2zw27\" (UID: \"55d8ebfe-a683-40f4-a3ef-bbeadb78ced7\") " pod="openshift-console-operator/console-operator-58897d9998-2zw27" Feb 27 00:09:02 crc 
kubenswrapper[4781]: I0227 00:09:02.291445 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55d8ebfe-a683-40f4-a3ef-bbeadb78ced7-trusted-ca\") pod \"console-operator-58897d9998-2zw27\" (UID: \"55d8ebfe-a683-40f4-a3ef-bbeadb78ced7\") " pod="openshift-console-operator/console-operator-58897d9998-2zw27" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.291504 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-sl77b"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.292081 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14579b3e-131e-4e98-b060-a93d2581479c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vhsds\" (UID: \"14579b3e-131e-4e98-b060-a93d2581479c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhsds" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.292415 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-sl77b" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.292428 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77c54f3f-bdb8-42ff-a466-3bfb1e2d9464-config\") pod \"machine-api-operator-5694c8668f-29z97\" (UID: \"77c54f3f-bdb8-42ff-a466-3bfb1e2d9464\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-29z97" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.292460 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3f3571fd-ce1b-4105-9100-020fd1cd5076-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tnl79\" (UID: \"3f3571fd-ce1b-4105-9100-020fd1cd5076\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tnl79" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.292646 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-k8qh8"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.293181 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/878b625f-d8df-457f-b208-f4bf5807a8d8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lz752\" (UID: \"878b625f-d8df-457f-b208-f4bf5807a8d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.293391 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-k8qh8" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.294042 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/77c54f3f-bdb8-42ff-a466-3bfb1e2d9464-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-29z97\" (UID: \"77c54f3f-bdb8-42ff-a466-3bfb1e2d9464\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-29z97" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.295241 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.295521 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.297843 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kxcrw"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.299282 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-cr2bb"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.300607 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-k8qh8"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.301704 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.302783 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-d9gmh"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.303912 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.304929 4781 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wdgtd"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.306321 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.306465 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wdgtd"] Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.308523 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.308532 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.308723 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.308619 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.315410 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.337190 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.355193 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.375072 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.395762 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.415416 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.435854 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.454825 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.476769 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.494825 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 27 
00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.515977 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.536168 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.555900 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.575871 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.595680 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.615773 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.635514 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.656187 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.676059 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.684539 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac4a870d-8cda-423b-a15b-391830c944f4-metrics-tls\") pod \"ingress-operator-5b745b69d9-cjjxc\" (UID: \"ac4a870d-8cda-423b-a15b-391830c944f4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc" Feb 27 00:09:02 crc 
kubenswrapper[4781]: I0227 00:09:02.695158 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.723071 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.730972 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac4a870d-8cda-423b-a15b-391830c944f4-trusted-ca\") pod \"ingress-operator-5b745b69d9-cjjxc\" (UID: \"ac4a870d-8cda-423b-a15b-391830c944f4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.735204 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.754902 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.775858 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.794897 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.815095 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.855886 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.875096 4781 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.885550 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/010c6a41-8e2d-4391-ac1b-82814dad98a4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-8d9mv\" (UID: \"010c6a41-8e2d-4391-ac1b-82814dad98a4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8d9mv" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.895415 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.916059 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.935577 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.955829 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.964319 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6dc17f1d-c1f4-43b9-9291-7c32c6804d44-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wgpv7\" (UID: \"6dc17f1d-c1f4-43b9-9291-7c32c6804d44\") " pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" Feb 27 00:09:02 crc kubenswrapper[4781]: I0227 00:09:02.976040 4781 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.006777 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.011969 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6dc17f1d-c1f4-43b9-9291-7c32c6804d44-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wgpv7\" (UID: \"6dc17f1d-c1f4-43b9-9291-7c32c6804d44\") " pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.015847 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.036432 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.057258 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.075174 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.096368 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.099759 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237-config\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-6w28d\" (UID: \"8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6w28d" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.115991 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.136095 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.144401 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13b9671c-f825-49de-913c-42e8d161f7f8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-g2dgp\" (UID: \"13b9671c-f825-49de-913c-42e8d161f7f8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g2dgp" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.155565 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.160785 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13b9671c-f825-49de-913c-42e8d161f7f8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-g2dgp\" (UID: \"13b9671c-f825-49de-913c-42e8d161f7f8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g2dgp" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.174664 4781 request.go:700] Waited for 1.006972722s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/secrets?fieldSelector=metadata.name%3Dkube-storage-version-migrator-operator-dockercfg-2bh8d&limit=500&resourceVersion=0 Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.177127 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.196812 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.216353 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.236013 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.255408 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.279250 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.284322 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ae09caff-6233-41f8-bb7d-a2314363e2fa-srv-cert\") pod \"olm-operator-6b444d44fb-mmx87\" (UID: \"ae09caff-6233-41f8-bb7d-a2314363e2fa\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87" Feb 27 00:09:03 crc kubenswrapper[4781]: E0227 00:09:03.287872 4781 secret.go:188] Couldn't get secret 
openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Feb 27 00:09:03 crc kubenswrapper[4781]: E0227 00:09:03.288090 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae09caff-6233-41f8-bb7d-a2314363e2fa-profile-collector-cert podName:ae09caff-6233-41f8-bb7d-a2314363e2fa nodeName:}" failed. No retries permitted until 2026-02-27 00:09:03.788066588 +0000 UTC m=+213.045606192 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "profile-collector-cert" (UniqueName: "kubernetes.io/secret/ae09caff-6233-41f8-bb7d-a2314363e2fa-profile-collector-cert") pod "olm-operator-6b444d44fb-mmx87" (UID: "ae09caff-6233-41f8-bb7d-a2314363e2fa") : failed to sync secret cache: timed out waiting for the condition Feb 27 00:09:03 crc kubenswrapper[4781]: E0227 00:09:03.287882 4781 secret.go:188] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: failed to sync secret cache: timed out waiting for the condition Feb 27 00:09:03 crc kubenswrapper[4781]: E0227 00:09:03.288367 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98d3eede-8852-4bf5-a905-25974e47445f-proxy-tls podName:98d3eede-8852-4bf5-a905-25974e47445f nodeName:}" failed. No retries permitted until 2026-02-27 00:09:03.788349864 +0000 UTC m=+213.045889498 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/98d3eede-8852-4bf5-a905-25974e47445f-proxy-tls") pod "machine-config-operator-74547568cd-rhhqx" (UID: "98d3eede-8852-4bf5-a905-25974e47445f") : failed to sync secret cache: timed out waiting for the condition Feb 27 00:09:03 crc kubenswrapper[4781]: E0227 00:09:03.287961 4781 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition Feb 27 00:09:03 crc kubenswrapper[4781]: E0227 00:09:03.288740 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/44e0d81c-a6e7-4e95-9901-ea32b8476755-signing-cabundle podName:44e0d81c-a6e7-4e95-9901-ea32b8476755 nodeName:}" failed. No retries permitted until 2026-02-27 00:09:03.788724143 +0000 UTC m=+213.046263777 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/44e0d81c-a6e7-4e95-9901-ea32b8476755-signing-cabundle") pod "service-ca-9c57cc56f-kxcrw" (UID: "44e0d81c-a6e7-4e95-9901-ea32b8476755") : failed to sync configmap cache: timed out waiting for the condition Feb 27 00:09:03 crc kubenswrapper[4781]: E0227 00:09:03.288778 4781 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Feb 27 00:09:03 crc kubenswrapper[4781]: E0227 00:09:03.289017 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6497cf4e-c461-4db9-88e4-5de2a5f28404-apiservice-cert podName:6497cf4e-c461-4db9-88e4-5de2a5f28404 nodeName:}" failed. No retries permitted until 2026-02-27 00:09:03.789001669 +0000 UTC m=+213.046541313 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/6497cf4e-c461-4db9-88e4-5de2a5f28404-apiservice-cert") pod "packageserver-d55dfcdfc-sw7s5" (UID: "6497cf4e-c461-4db9-88e4-5de2a5f28404") : failed to sync secret cache: timed out waiting for the condition Feb 27 00:09:03 crc kubenswrapper[4781]: E0227 00:09:03.288905 4781 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/machine-config-operator-images: failed to sync configmap cache: timed out waiting for the condition Feb 27 00:09:03 crc kubenswrapper[4781]: E0227 00:09:03.289284 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/98d3eede-8852-4bf5-a905-25974e47445f-images podName:98d3eede-8852-4bf5-a905-25974e47445f nodeName:}" failed. No retries permitted until 2026-02-27 00:09:03.789269435 +0000 UTC m=+213.046809079 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/98d3eede-8852-4bf5-a905-25974e47445f-images") pod "machine-config-operator-74547568cd-rhhqx" (UID: "98d3eede-8852-4bf5-a905-25974e47445f") : failed to sync configmap cache: timed out waiting for the condition Feb 27 00:09:03 crc kubenswrapper[4781]: E0227 00:09:03.288918 4781 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Feb 27 00:09:03 crc kubenswrapper[4781]: E0227 00:09:03.289535 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6497cf4e-c461-4db9-88e4-5de2a5f28404-webhook-cert podName:6497cf4e-c461-4db9-88e4-5de2a5f28404 nodeName:}" failed. No retries permitted until 2026-02-27 00:09:03.789518601 +0000 UTC m=+213.047058235 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/6497cf4e-c461-4db9-88e4-5de2a5f28404-webhook-cert") pod "packageserver-d55dfcdfc-sw7s5" (UID: "6497cf4e-c461-4db9-88e4-5de2a5f28404") : failed to sync secret cache: timed out waiting for the condition Feb 27 00:09:03 crc kubenswrapper[4781]: E0227 00:09:03.289978 4781 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition Feb 27 00:09:03 crc kubenswrapper[4781]: E0227 00:09:03.290054 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44e0d81c-a6e7-4e95-9901-ea32b8476755-signing-key podName:44e0d81c-a6e7-4e95-9901-ea32b8476755 nodeName:}" failed. No retries permitted until 2026-02-27 00:09:03.790041103 +0000 UTC m=+213.047580727 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/44e0d81c-a6e7-4e95-9901-ea32b8476755-signing-key") pod "service-ca-9c57cc56f-kxcrw" (UID: "44e0d81c-a6e7-4e95-9901-ea32b8476755") : failed to sync secret cache: timed out waiting for the condition Feb 27 00:09:03 crc kubenswrapper[4781]: E0227 00:09:03.290241 4781 secret.go:188] Couldn't get secret openshift-kube-scheduler-operator/kube-scheduler-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 27 00:09:03 crc kubenswrapper[4781]: E0227 00:09:03.290406 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237-serving-cert podName:8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237 nodeName:}" failed. No retries permitted until 2026-02-27 00:09:03.790389511 +0000 UTC m=+213.047929135 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237-serving-cert") pod "openshift-kube-scheduler-operator-5fdd9b5758-6w28d" (UID: "8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237") : failed to sync secret cache: timed out waiting for the condition Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.295217 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.314666 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.334892 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.354965 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.375287 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.395102 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.415044 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.435152 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.454788 4781 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.475444 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.494959 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.514802 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.536061 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.555403 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.575319 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.595328 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.615151 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.635744 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.654514 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 
00:09:03.695056 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.714994 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.736481 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.780766 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grq2j\" (UniqueName: \"kubernetes.io/projected/d9ce11ed-3022-47e0-8150-8af94af65076-kube-api-access-grq2j\") pod \"apiserver-76f77b778f-cr2bb\" (UID: \"d9ce11ed-3022-47e0-8150-8af94af65076\") " pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.800148 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf6dv\" (UniqueName: \"kubernetes.io/projected/cb172836-9833-43d5-a99b-cc01b3dd6694-kube-api-access-tf6dv\") pod \"machine-approver-56656f9798-rw9ls\" (UID: \"cb172836-9833-43d5-a99b-cc01b3dd6694\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.806435 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6497cf4e-c461-4db9-88e4-5de2a5f28404-webhook-cert\") pod \"packageserver-d55dfcdfc-sw7s5\" (UID: \"6497cf4e-c461-4db9-88e4-5de2a5f28404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.806489 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-6w28d\" (UID: \"8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6w28d" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.806526 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6497cf4e-c461-4db9-88e4-5de2a5f28404-apiservice-cert\") pod \"packageserver-d55dfcdfc-sw7s5\" (UID: \"6497cf4e-c461-4db9-88e4-5de2a5f28404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.806542 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/44e0d81c-a6e7-4e95-9901-ea32b8476755-signing-cabundle\") pod \"service-ca-9c57cc56f-kxcrw\" (UID: \"44e0d81c-a6e7-4e95-9901-ea32b8476755\") " pod="openshift-service-ca/service-ca-9c57cc56f-kxcrw" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.806576 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/98d3eede-8852-4bf5-a905-25974e47445f-proxy-tls\") pod \"machine-config-operator-74547568cd-rhhqx\" (UID: \"98d3eede-8852-4bf5-a905-25974e47445f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.806600 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ae09caff-6233-41f8-bb7d-a2314363e2fa-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mmx87\" (UID: \"ae09caff-6233-41f8-bb7d-a2314363e2fa\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.806739 4781 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/44e0d81c-a6e7-4e95-9901-ea32b8476755-signing-key\") pod \"service-ca-9c57cc56f-kxcrw\" (UID: \"44e0d81c-a6e7-4e95-9901-ea32b8476755\") " pod="openshift-service-ca/service-ca-9c57cc56f-kxcrw" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.806785 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/98d3eede-8852-4bf5-a905-25974e47445f-images\") pod \"machine-config-operator-74547568cd-rhhqx\" (UID: \"98d3eede-8852-4bf5-a905-25974e47445f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.807240 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/98d3eede-8852-4bf5-a905-25974e47445f-images\") pod \"machine-config-operator-74547568cd-rhhqx\" (UID: \"98d3eede-8852-4bf5-a905-25974e47445f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.808160 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/44e0d81c-a6e7-4e95-9901-ea32b8476755-signing-cabundle\") pod \"service-ca-9c57cc56f-kxcrw\" (UID: \"44e0d81c-a6e7-4e95-9901-ea32b8476755\") " pod="openshift-service-ca/service-ca-9c57cc56f-kxcrw" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.811279 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6497cf4e-c461-4db9-88e4-5de2a5f28404-apiservice-cert\") pod \"packageserver-d55dfcdfc-sw7s5\" (UID: \"6497cf4e-c461-4db9-88e4-5de2a5f28404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.811834 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/44e0d81c-a6e7-4e95-9901-ea32b8476755-signing-key\") pod \"service-ca-9c57cc56f-kxcrw\" (UID: \"44e0d81c-a6e7-4e95-9901-ea32b8476755\") " pod="openshift-service-ca/service-ca-9c57cc56f-kxcrw" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.811903 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6w28d\" (UID: \"8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6w28d" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.812641 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ae09caff-6233-41f8-bb7d-a2314363e2fa-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mmx87\" (UID: \"ae09caff-6233-41f8-bb7d-a2314363e2fa\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.812960 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/98d3eede-8852-4bf5-a905-25974e47445f-proxy-tls\") pod \"machine-config-operator-74547568cd-rhhqx\" (UID: \"98d3eede-8852-4bf5-a905-25974e47445f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.819851 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77s24\" (UniqueName: \"kubernetes.io/projected/3ba2e306-8f79-4e15-8529-f3a16a0fa95f-kube-api-access-77s24\") pod \"etcd-operator-b45778765-f4jxd\" (UID: \"3ba2e306-8f79-4e15-8529-f3a16a0fa95f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd" Feb 27 00:09:03 crc 
kubenswrapper[4781]: I0227 00:09:03.834141 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrg6p\" (UniqueName: \"kubernetes.io/projected/a24423db-53f2-4555-81e4-228b3911e144-kube-api-access-xrg6p\") pod \"route-controller-manager-6576b87f9c-swgz7\" (UID: \"a24423db-53f2-4555-81e4-228b3911e144\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.842794 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6497cf4e-c461-4db9-88e4-5de2a5f28404-webhook-cert\") pod \"packageserver-d55dfcdfc-sw7s5\" (UID: \"6497cf4e-c461-4db9-88e4-5de2a5f28404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.849292 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnbwb\" (UniqueName: \"kubernetes.io/projected/c7332c18-9748-49d2-b512-a46c2d1fcb79-kube-api-access-tnbwb\") pod \"controller-manager-879f6c89f-ktjdc\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.855922 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.873660 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.875243 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.896047 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.916259 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.935525 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.968360 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.983330 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.984960 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f8zb\" (UniqueName: \"kubernetes.io/projected/b9dadb6a-e49e-4473-8338-3af567aacb4a-kube-api-access-9f8zb\") pod \"apiserver-7bbb656c7d-ht6qt\" (UID: \"b9dadb6a-e49e-4473-8338-3af567aacb4a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:03 crc kubenswrapper[4781]: I0227 00:09:03.996954 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.036917 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.037180 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.038126 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjdbq\" (UniqueName: \"kubernetes.io/projected/76705148-274c-4428-9508-13fe1193646e-kube-api-access-xjdbq\") pod \"console-f9d7485db-vtsxv\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.057359 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.072984 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.087923 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.089006 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.107489 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkqct\" (UniqueName: \"kubernetes.io/projected/44e0d81c-a6e7-4e95-9901-ea32b8476755-kube-api-access-dkqct\") pod \"service-ca-9c57cc56f-kxcrw\" (UID: \"44e0d81c-a6e7-4e95-9901-ea32b8476755\") " pod="openshift-service-ca/service-ca-9c57cc56f-kxcrw" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.127326 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58vdh\" (UniqueName: \"kubernetes.io/projected/878b625f-d8df-457f-b208-f4bf5807a8d8-kube-api-access-58vdh\") pod \"cluster-image-registry-operator-dc59b4c8b-lz752\" (UID: \"878b625f-d8df-457f-b208-f4bf5807a8d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.132241 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzxgj\" (UniqueName: \"kubernetes.io/projected/77c54f3f-bdb8-42ff-a466-3bfb1e2d9464-kube-api-access-pzxgj\") pod \"machine-api-operator-5694c8668f-29z97\" (UID: \"77c54f3f-bdb8-42ff-a466-3bfb1e2d9464\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-29z97" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.154873 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9gbc\" (UniqueName: \"kubernetes.io/projected/ac4a870d-8cda-423b-a15b-391830c944f4-kube-api-access-k9gbc\") pod \"ingress-operator-5b745b69d9-cjjxc\" (UID: \"ac4a870d-8cda-423b-a15b-391830c944f4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc" Feb 
27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.174648 4781 request.go:700] Waited for 1.886102816s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/serviceaccounts/cluster-image-registry-operator/token Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.174880 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ktjdc"] Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.177799 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw4cs\" (UniqueName: \"kubernetes.io/projected/6dc17f1d-c1f4-43b9-9291-7c32c6804d44-kube-api-access-mw4cs\") pod \"marketplace-operator-79b997595-wgpv7\" (UID: \"6dc17f1d-c1f4-43b9-9291-7c32c6804d44\") " pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.192228 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/878b625f-d8df-457f-b208-f4bf5807a8d8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lz752\" (UID: \"878b625f-d8df-457f-b208-f4bf5807a8d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752" Feb 27 00:09:04 crc kubenswrapper[4781]: W0227 00:09:04.193410 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7332c18_9748_49d2_b512_a46c2d1fcb79.slice/crio-569469b156a3d6f73fda1c00c629b8cfcf29a4662b4eccaa3dcb213bb4a0f1d1 WatchSource:0}: Error finding container 569469b156a3d6f73fda1c00c629b8cfcf29a4662b4eccaa3dcb213bb4a0f1d1: Status 404 returned error can't find the container with id 569469b156a3d6f73fda1c00c629b8cfcf29a4662b4eccaa3dcb213bb4a0f1d1 Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.207747 4781 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.209237 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.210210 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2sq5\" (UniqueName: \"kubernetes.io/projected/13b9671c-f825-49de-913c-42e8d161f7f8-kube-api-access-r2sq5\") pod \"kube-storage-version-migrator-operator-b67b599dd-g2dgp\" (UID: \"13b9671c-f825-49de-913c-42e8d161f7f8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g2dgp" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.227918 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g2dgp" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.231271 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz5ss\" (UniqueName: \"kubernetes.io/projected/14579b3e-131e-4e98-b060-a93d2581479c-kube-api-access-cz5ss\") pod \"openshift-controller-manager-operator-756b6f6bc6-vhsds\" (UID: \"14579b3e-131e-4e98-b060-a93d2581479c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhsds" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.249273 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7llv\" (UniqueName: \"kubernetes.io/projected/ae09caff-6233-41f8-bb7d-a2314363e2fa-kube-api-access-m7llv\") pod \"olm-operator-6b444d44fb-mmx87\" (UID: \"ae09caff-6233-41f8-bb7d-a2314363e2fa\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.267875 4781 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.273502 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pgmz\" (UniqueName: \"kubernetes.io/projected/010c6a41-8e2d-4391-ac1b-82814dad98a4-kube-api-access-9pgmz\") pod \"control-plane-machine-set-operator-78cbb6b69f-8d9mv\" (UID: \"010c6a41-8e2d-4391-ac1b-82814dad98a4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8d9mv" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.293563 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-cr2bb"] Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.293811 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djzfc\" (UniqueName: \"kubernetes.io/projected/3f3571fd-ce1b-4105-9100-020fd1cd5076-kube-api-access-djzfc\") pod \"cluster-samples-operator-665b6dd947-tnl79\" (UID: \"3f3571fd-ce1b-4105-9100-020fd1cd5076\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tnl79" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.303325 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-kxcrw" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.308043 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ac4a870d-8cda-423b-a15b-391830c944f4-bound-sa-token\") pod \"ingress-operator-5b745b69d9-cjjxc\" (UID: \"ac4a870d-8cda-423b-a15b-391830c944f4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.330139 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt"] Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.332117 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6w28d\" (UID: \"8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6w28d" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.349774 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq5bh\" (UniqueName: \"kubernetes.io/projected/98d3eede-8852-4bf5-a905-25974e47445f-kube-api-access-pq5bh\") pod \"machine-config-operator-74547568cd-rhhqx\" (UID: \"98d3eede-8852-4bf5-a905-25974e47445f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.367937 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhsds" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.374751 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnwlg\" (UniqueName: \"kubernetes.io/projected/6497cf4e-c461-4db9-88e4-5de2a5f28404-kube-api-access-qnwlg\") pod \"packageserver-d55dfcdfc-sw7s5\" (UID: \"6497cf4e-c461-4db9-88e4-5de2a5f28404\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.395658 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66pts\" (UniqueName: \"kubernetes.io/projected/55d8ebfe-a683-40f4-a3ef-bbeadb78ced7-kube-api-access-66pts\") pod \"console-operator-58897d9998-2zw27\" (UID: \"55d8ebfe-a683-40f4-a3ef-bbeadb78ced7\") " pod="openshift-console-operator/console-operator-58897d9998-2zw27" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.399678 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.413152 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-f4jxd"] Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.416139 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.416168 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-29z97" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.430660 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7"] Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.435866 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.458298 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.475370 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.481258 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.495068 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6w28d" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.495819 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.502321 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8d9mv" Feb 27 00:09:04 crc kubenswrapper[4781]: W0227 00:09:04.508231 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda24423db_53f2_4555_81e4_228b3911e144.slice/crio-e2d52d381f5f2c2aa6e3b3529d449b9cd90d4bab2b2b6374496041f06b7f95d6 WatchSource:0}: Error finding container e2d52d381f5f2c2aa6e3b3529d449b9cd90d4bab2b2b6374496041f06b7f95d6: Status 404 returned error can't find the container with id e2d52d381f5f2c2aa6e3b3529d449b9cd90d4bab2b2b6374496041f06b7f95d6 Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.515114 4781 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.537475 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.539429 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.550155 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tnl79" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.553304 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.557374 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.560574 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.577860 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.596119 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.623533 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.636580 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.644803 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-2zw27" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.658392 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kxcrw"] Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.658403 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.678893 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.691590 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhsds"] Feb 27 00:09:04 crc kubenswrapper[4781]: W0227 00:09:04.707501 4781 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44e0d81c_a6e7_4e95_9901_ea32b8476755.slice/crio-a862c59333a3fae30dcb2d5fe0c1a79ccdc0f501b86262bc96c6db90988cbb9d WatchSource:0}: Error finding container a862c59333a3fae30dcb2d5fe0c1a79ccdc0f501b86262bc96c6db90988cbb9d: Status 404 returned error can't find the container with id a862c59333a3fae30dcb2d5fe0c1a79ccdc0f501b86262bc96c6db90988cbb9d Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.714063 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-29z97"] Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.727408 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wgpv7"] Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.747937 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748212 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748232 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-856kl\" (UniqueName: 
\"kubernetes.io/projected/91e2c481-01ee-461f-bc5b-d09b7ea221c5-kube-api-access-856kl\") pod \"image-pruner-29535840-t9tlz\" (UID: \"91e2c481-01ee-461f-bc5b-d09b7ea221c5\") " pod="openshift-image-registry/image-pruner-29535840-t9tlz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748249 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwd7v\" (UniqueName: \"kubernetes.io/projected/16339491-baee-42b5-82bb-07bca82a5f77-kube-api-access-fwd7v\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748265 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748280 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748295 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx57q\" (UniqueName: \"kubernetes.io/projected/1234fab4-2533-4255-bdc2-dd1c3d3d61b5-kube-api-access-hx57q\") pod \"catalog-operator-68c6474976-7vd5x\" (UID: 
\"1234fab4-2533-4255-bdc2-dd1c3d3d61b5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748311 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b5292e2-0434-46c6-ba9e-33622d4d5cbf-config\") pod \"openshift-apiserver-operator-796bbdcf4f-72rjz\" (UID: \"5b5292e2-0434-46c6-ba9e-33622d4d5cbf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-72rjz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748326 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6846d54c-4d22-46c7-b017-947a3986d773-default-certificate\") pod \"router-default-5444994796-8lcg4\" (UID: \"6846d54c-4d22-46c7-b017-947a3986d773\") " pod="openshift-ingress/router-default-5444994796-8lcg4" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748342 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4ncb\" (UniqueName: \"kubernetes.io/projected/d5b604c3-aa52-42f3-8922-8edee056f016-kube-api-access-g4ncb\") pod \"dns-operator-744455d44c-d9gmh\" (UID: \"d5b604c3-aa52-42f3-8922-8edee056f016\") " pod="openshift-dns-operator/dns-operator-744455d44c-d9gmh" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748358 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748384 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf32f77b-92ad-479d-8ee3-423f16089eb6-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-9z8qr\" (UID: \"bf32f77b-92ad-479d-8ee3-423f16089eb6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748402 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsc87\" (UniqueName: \"kubernetes.io/projected/7822bd5e-93d1-4f1e-961c-ec0c8a04ab59-kube-api-access-gsc87\") pod \"multus-admission-controller-857f4d67dd-6rw4v\" (UID: \"7822bd5e-93d1-4f1e-961c-ec0c8a04ab59\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6rw4v" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748428 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/16339491-baee-42b5-82bb-07bca82a5f77-registry-certificates\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748446 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748462 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k94m7\" (UniqueName: 
\"kubernetes.io/projected/6846d54c-4d22-46c7-b017-947a3986d773-kube-api-access-k94m7\") pod \"router-default-5444994796-8lcg4\" (UID: \"6846d54c-4d22-46c7-b017-947a3986d773\") " pod="openshift-ingress/router-default-5444994796-8lcg4" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748499 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62-config\") pod \"kube-apiserver-operator-766d6c64bb-sp4hz\" (UID: \"fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sp4hz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748517 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czjn5\" (UniqueName: \"kubernetes.io/projected/5b5292e2-0434-46c6-ba9e-33622d4d5cbf-kube-api-access-czjn5\") pod \"openshift-apiserver-operator-796bbdcf4f-72rjz\" (UID: \"5b5292e2-0434-46c6-ba9e-33622d4d5cbf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-72rjz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748540 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-sp4hz\" (UID: \"fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sp4hz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748554 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/678f27fc-d210-4a4f-bd73-090378740da9-config-volume\") pod \"collect-profiles-29535840-tfxxm\" (UID: \"678f27fc-d210-4a4f-bd73-090378740da9\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748573 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfnpb\" (UniqueName: \"kubernetes.io/projected/a75bfacf-8cf7-4560-8b4a-6e876daa4c8c-kube-api-access-tfnpb\") pod \"downloads-7954f5f757-qjwrj\" (UID: \"a75bfacf-8cf7-4560-8b4a-6e876daa4c8c\") " pod="openshift-console/downloads-7954f5f757-qjwrj" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748590 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/16339491-baee-42b5-82bb-07bca82a5f77-registry-tls\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748612 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748660 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748677 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-kr2kq\" (UniqueName: \"kubernetes.io/projected/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-kube-api-access-kr2kq\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748691 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1234fab4-2533-4255-bdc2-dd1c3d3d61b5-srv-cert\") pod \"catalog-operator-68c6474976-7vd5x\" (UID: \"1234fab4-2533-4255-bdc2-dd1c3d3d61b5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748707 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-audit-dir\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748723 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf32f77b-92ad-479d-8ee3-423f16089eb6-service-ca-bundle\") pod \"authentication-operator-69f744f599-9z8qr\" (UID: \"bf32f77b-92ad-479d-8ee3-423f16089eb6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748749 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf32f77b-92ad-479d-8ee3-423f16089eb6-serving-cert\") pod \"authentication-operator-69f744f599-9z8qr\" (UID: \"bf32f77b-92ad-479d-8ee3-423f16089eb6\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748784 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9d51c244-aac1-41de-adc4-2393a45392f1-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-kmmt9\" (UID: \"9d51c244-aac1-41de-adc4-2393a45392f1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmmt9" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748799 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/678f27fc-d210-4a4f-bd73-090378740da9-secret-volume\") pod \"collect-profiles-29535840-tfxxm\" (UID: \"678f27fc-d210-4a4f-bd73-090378740da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748815 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748833 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnhbf\" (UniqueName: \"kubernetes.io/projected/14db9d97-7da5-43c2-8d48-fb435f1a19d0-kube-api-access-vnhbf\") pod \"machine-config-controller-84d6567774-cfts2\" (UID: \"14db9d97-7da5-43c2-8d48-fb435f1a19d0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cfts2" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748865 
4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e1c9b213-8c36-4ecf-831f-69a912f6364f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fdkct\" (UID: \"e1c9b213-8c36-4ecf-831f-69a912f6364f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748888 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jldnh\" (UniqueName: \"kubernetes.io/projected/5cbee45f-1bdf-44e9-9782-83340ea69870-kube-api-access-jldnh\") pod \"package-server-manager-789f6589d5-8mth6\" (UID: \"5cbee45f-1bdf-44e9-9782-83340ea69870\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8mth6"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748910 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/14db9d97-7da5-43c2-8d48-fb435f1a19d0-proxy-tls\") pod \"machine-config-controller-84d6567774-cfts2\" (UID: \"14db9d97-7da5-43c2-8d48-fb435f1a19d0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cfts2"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748927 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/5cbee45f-1bdf-44e9-9782-83340ea69870-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8mth6\" (UID: \"5cbee45f-1bdf-44e9-9782-83340ea69870\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8mth6"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748943 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1234fab4-2533-4255-bdc2-dd1c3d3d61b5-profile-collector-cert\") pod \"catalog-operator-68c6474976-7vd5x\" (UID: \"1234fab4-2533-4255-bdc2-dd1c3d3d61b5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748959 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748976 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1c9b213-8c36-4ecf-831f-69a912f6364f-serving-cert\") pod \"openshift-config-operator-7777fb866f-fdkct\" (UID: \"e1c9b213-8c36-4ecf-831f-69a912f6364f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.748992 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7822bd5e-93d1-4f1e-961c-ec0c8a04ab59-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6rw4v\" (UID: \"7822bd5e-93d1-4f1e-961c-ec0c8a04ab59\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6rw4v"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749019 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsq8c\" (UniqueName: \"kubernetes.io/projected/e1c9b213-8c36-4ecf-831f-69a912f6364f-kube-api-access-hsq8c\") pod \"openshift-config-operator-7777fb866f-fdkct\" (UID: \"e1c9b213-8c36-4ecf-831f-69a912f6364f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749044 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d51c244-aac1-41de-adc4-2393a45392f1-config\") pod \"kube-controller-manager-operator-78b949d7b-kmmt9\" (UID: \"9d51c244-aac1-41de-adc4-2393a45392f1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmmt9"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749059 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/91e2c481-01ee-461f-bc5b-d09b7ea221c5-serviceca\") pod \"image-pruner-29535840-t9tlz\" (UID: \"91e2c481-01ee-461f-bc5b-d09b7ea221c5\") " pod="openshift-image-registry/image-pruner-29535840-t9tlz"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749076 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/16339491-baee-42b5-82bb-07bca82a5f77-bound-sa-token\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749091 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6846d54c-4d22-46c7-b017-947a3986d773-stats-auth\") pod \"router-default-5444994796-8lcg4\" (UID: \"6846d54c-4d22-46c7-b017-947a3986d773\") " pod="openshift-ingress/router-default-5444994796-8lcg4"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749114 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-sp4hz\" (UID: \"fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sp4hz"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749130 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-audit-policies\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749154 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/14db9d97-7da5-43c2-8d48-fb435f1a19d0-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-cfts2\" (UID: \"14db9d97-7da5-43c2-8d48-fb435f1a19d0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cfts2"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749169 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9gg9\" (UniqueName: \"kubernetes.io/projected/396c6e41-89e8-4ecf-ac96-f73aad1f4bbb-kube-api-access-l9gg9\") pod \"migrator-59844c95c7-2xvkz\" (UID: \"396c6e41-89e8-4ecf-ac96-f73aad1f4bbb\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2xvkz"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749186 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16339491-baee-42b5-82bb-07bca82a5f77-trusted-ca\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749202 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95hgc\" (UniqueName: \"kubernetes.io/projected/678f27fc-d210-4a4f-bd73-090378740da9-kube-api-access-95hgc\") pod \"collect-profiles-29535840-tfxxm\" (UID: \"678f27fc-d210-4a4f-bd73-090378740da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749219 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749233 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf32f77b-92ad-479d-8ee3-423f16089eb6-config\") pod \"authentication-operator-69f744f599-9z8qr\" (UID: \"bf32f77b-92ad-479d-8ee3-423f16089eb6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749248 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6846d54c-4d22-46c7-b017-947a3986d773-metrics-certs\") pod \"router-default-5444994796-8lcg4\" (UID: \"6846d54c-4d22-46c7-b017-947a3986d773\") " pod="openshift-ingress/router-default-5444994796-8lcg4"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749273 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6846d54c-4d22-46c7-b017-947a3986d773-service-ca-bundle\") pod \"router-default-5444994796-8lcg4\" (UID: \"6846d54c-4d22-46c7-b017-947a3986d773\") " pod="openshift-ingress/router-default-5444994796-8lcg4"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749298 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b5292e2-0434-46c6-ba9e-33622d4d5cbf-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-72rjz\" (UID: \"5b5292e2-0434-46c6-ba9e-33622d4d5cbf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-72rjz"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749319 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d5b604c3-aa52-42f3-8922-8edee056f016-metrics-tls\") pod \"dns-operator-744455d44c-d9gmh\" (UID: \"d5b604c3-aa52-42f3-8922-8edee056f016\") " pod="openshift-dns-operator/dns-operator-744455d44c-d9gmh"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749335 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pqfm\" (UniqueName: \"kubernetes.io/projected/bf32f77b-92ad-479d-8ee3-423f16089eb6-kube-api-access-9pqfm\") pod \"authentication-operator-69f744f599-9z8qr\" (UID: \"bf32f77b-92ad-479d-8ee3-423f16089eb6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749349 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/16339491-baee-42b5-82bb-07bca82a5f77-installation-pull-secrets\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749376 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749392 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d51c244-aac1-41de-adc4-2393a45392f1-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-kmmt9\" (UID: \"9d51c244-aac1-41de-adc4-2393a45392f1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmmt9"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.749408 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/16339491-baee-42b5-82bb-07bca82a5f77-ca-trust-extracted\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c"
Feb 27 00:09:04 crc kubenswrapper[4781]: E0227 00:09:04.752329 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:05.252314125 +0000 UTC m=+214.509853669 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.775585 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g2dgp"]
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.781054 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-vtsxv"]
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.782022 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752"]
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.783147 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhsds" event={"ID":"14579b3e-131e-4e98-b060-a93d2581479c","Type":"ContainerStarted","Data":"cb59e805734117202d19664eb43966ce8e0467aee64b770dd770da346fa9a444"}
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.784303 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd" event={"ID":"3ba2e306-8f79-4e15-8529-f3a16a0fa95f","Type":"ContainerStarted","Data":"1c300badcad0a7b1d1f35986cff4462b758b04b4b3586b86436a5dec1d1bdfe1"}
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.785465 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" event={"ID":"c7332c18-9748-49d2-b512-a46c2d1fcb79","Type":"ContainerStarted","Data":"7953132f6160b5cf17723ae05d8c6903d6203982009ce6fad05bc88ee99ff710"}
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.785500 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" event={"ID":"c7332c18-9748-49d2-b512-a46c2d1fcb79","Type":"ContainerStarted","Data":"569469b156a3d6f73fda1c00c629b8cfcf29a4662b4eccaa3dcb213bb4a0f1d1"}
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.785738 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.788216 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" event={"ID":"a24423db-53f2-4555-81e4-228b3911e144","Type":"ContainerStarted","Data":"e2d52d381f5f2c2aa6e3b3529d449b9cd90d4bab2b2b6374496041f06b7f95d6"}
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.789110 4781 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-ktjdc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body=
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.789138 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" podUID="c7332c18-9748-49d2-b512-a46c2d1fcb79" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.792582 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" event={"ID":"b9dadb6a-e49e-4473-8338-3af567aacb4a","Type":"ContainerStarted","Data":"e9d5a1980724d143f2a7fb6b4bfe55b32f38b196ca38766a71fb45630ec5a5f0"}
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.795395 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls" event={"ID":"cb172836-9833-43d5-a99b-cc01b3dd6694","Type":"ContainerStarted","Data":"012340f634683fb6a06950de9235a76463299205e92e22dd7302613228455891"}
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.795417 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls" event={"ID":"cb172836-9833-43d5-a99b-cc01b3dd6694","Type":"ContainerStarted","Data":"154a034e255de1d0282d96277f75f0f98ed53321238e0ad1ca74e3a78b581c32"}
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.796470 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-kxcrw" event={"ID":"44e0d81c-a6e7-4e95-9901-ea32b8476755","Type":"ContainerStarted","Data":"a862c59333a3fae30dcb2d5fe0c1a79ccdc0f501b86262bc96c6db90988cbb9d"}
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.799236 4781 generic.go:334] "Generic (PLEG): container finished" podID="d9ce11ed-3022-47e0-8150-8af94af65076" containerID="af1ff5a8ff8c84fc8b96f4c32a6a04d56e630d32b8fbaa75d297c098223eb3db" exitCode=0
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.799280 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" event={"ID":"d9ce11ed-3022-47e0-8150-8af94af65076","Type":"ContainerDied","Data":"af1ff5a8ff8c84fc8b96f4c32a6a04d56e630d32b8fbaa75d297c098223eb3db"}
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.799306 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" event={"ID":"d9ce11ed-3022-47e0-8150-8af94af65076","Type":"ContainerStarted","Data":"2ecfb860b15c367b713db5be275357775650915a6aaa9a58138869228dd57b2c"}
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.846799 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8d9mv"]
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.850500 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 00:09:04 crc kubenswrapper[4781]: E0227 00:09:04.850697 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:05.350671147 +0000 UTC m=+214.608210701 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.850734 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62-config\") pod \"kube-apiserver-operator-766d6c64bb-sp4hz\" (UID: \"fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sp4hz"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.850760 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czjn5\" (UniqueName: \"kubernetes.io/projected/5b5292e2-0434-46c6-ba9e-33622d4d5cbf-kube-api-access-czjn5\") pod \"openshift-apiserver-operator-796bbdcf4f-72rjz\" (UID: \"5b5292e2-0434-46c6-ba9e-33622d4d5cbf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-72rjz"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.850800 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/26e75b38-be64-4f34-933f-731abfe217b6-plugins-dir\") pod \"csi-hostpathplugin-wdgtd\" (UID: \"26e75b38-be64-4f34-933f-731abfe217b6\") " pod="hostpath-provisioner/csi-hostpathplugin-wdgtd"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.850818 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-sp4hz\" (UID: \"fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sp4hz"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.850871 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/678f27fc-d210-4a4f-bd73-090378740da9-config-volume\") pod \"collect-profiles-29535840-tfxxm\" (UID: \"678f27fc-d210-4a4f-bd73-090378740da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.850888 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/26e75b38-be64-4f34-933f-731abfe217b6-socket-dir\") pod \"csi-hostpathplugin-wdgtd\" (UID: \"26e75b38-be64-4f34-933f-731abfe217b6\") " pod="hostpath-provisioner/csi-hostpathplugin-wdgtd"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.850924 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfnpb\" (UniqueName: \"kubernetes.io/projected/a75bfacf-8cf7-4560-8b4a-6e876daa4c8c-kube-api-access-tfnpb\") pod \"downloads-7954f5f757-qjwrj\" (UID: \"a75bfacf-8cf7-4560-8b4a-6e876daa4c8c\") " pod="openshift-console/downloads-7954f5f757-qjwrj"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.850943 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16f4a859-d834-408d-9a9c-4d293b47d95a-config\") pod \"service-ca-operator-777779d784-ksvtc\" (UID: \"16f4a859-d834-408d-9a9c-4d293b47d95a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ksvtc"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.850959 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/16339491-baee-42b5-82bb-07bca82a5f77-registry-tls\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.850979 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851029 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2r6k\" (UniqueName: \"kubernetes.io/projected/16f4a859-d834-408d-9a9c-4d293b47d95a-kube-api-access-f2r6k\") pod \"service-ca-operator-777779d784-ksvtc\" (UID: \"16f4a859-d834-408d-9a9c-4d293b47d95a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ksvtc"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851064 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc7gw\" (UniqueName: \"kubernetes.io/projected/bde9c2fa-aa41-445a-bfb3-eecde86f5ce5-kube-api-access-dc7gw\") pod \"dns-default-k8qh8\" (UID: \"bde9c2fa-aa41-445a-bfb3-eecde86f5ce5\") " pod="openshift-dns/dns-default-k8qh8"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851079 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16f4a859-d834-408d-9a9c-4d293b47d95a-serving-cert\") pod \"service-ca-operator-777779d784-ksvtc\" (UID: \"16f4a859-d834-408d-9a9c-4d293b47d95a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ksvtc"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851136 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851153 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/26e75b38-be64-4f34-933f-731abfe217b6-registration-dir\") pod \"csi-hostpathplugin-wdgtd\" (UID: \"26e75b38-be64-4f34-933f-731abfe217b6\") " pod="hostpath-provisioner/csi-hostpathplugin-wdgtd"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851168 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr2kq\" (UniqueName: \"kubernetes.io/projected/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-kube-api-access-kr2kq\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851185 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1234fab4-2533-4255-bdc2-dd1c3d3d61b5-srv-cert\") pod \"catalog-operator-68c6474976-7vd5x\" (UID: \"1234fab4-2533-4255-bdc2-dd1c3d3d61b5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851211 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-audit-dir\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851228 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/38d33d08-97ce-49cb-b200-8ee30fc09e77-certs\") pod \"machine-config-server-sl77b\" (UID: \"38d33d08-97ce-49cb-b200-8ee30fc09e77\") " pod="openshift-machine-config-operator/machine-config-server-sl77b"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851251 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf32f77b-92ad-479d-8ee3-423f16089eb6-service-ca-bundle\") pod \"authentication-operator-69f744f599-9z8qr\" (UID: \"bf32f77b-92ad-479d-8ee3-423f16089eb6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851266 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k8gs\" (UniqueName: \"kubernetes.io/projected/38d33d08-97ce-49cb-b200-8ee30fc09e77-kube-api-access-4k8gs\") pod \"machine-config-server-sl77b\" (UID: \"38d33d08-97ce-49cb-b200-8ee30fc09e77\") " pod="openshift-machine-config-operator/machine-config-server-sl77b"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851291 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf32f77b-92ad-479d-8ee3-423f16089eb6-serving-cert\") pod \"authentication-operator-69f744f599-9z8qr\" (UID: \"bf32f77b-92ad-479d-8ee3-423f16089eb6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851308 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9d51c244-aac1-41de-adc4-2393a45392f1-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-kmmt9\" (UID: \"9d51c244-aac1-41de-adc4-2393a45392f1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmmt9"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851327 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/678f27fc-d210-4a4f-bd73-090378740da9-secret-volume\") pod \"collect-profiles-29535840-tfxxm\" (UID: \"678f27fc-d210-4a4f-bd73-090378740da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851358 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851379 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnhbf\" (UniqueName: \"kubernetes.io/projected/14db9d97-7da5-43c2-8d48-fb435f1a19d0-kube-api-access-vnhbf\") pod \"machine-config-controller-84d6567774-cfts2\" (UID: \"14db9d97-7da5-43c2-8d48-fb435f1a19d0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cfts2"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851396 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/26e75b38-be64-4f34-933f-731abfe217b6-mountpoint-dir\") pod \"csi-hostpathplugin-wdgtd\" (UID: \"26e75b38-be64-4f34-933f-731abfe217b6\") " pod="hostpath-provisioner/csi-hostpathplugin-wdgtd"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851419 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e1c9b213-8c36-4ecf-831f-69a912f6364f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fdkct\" (UID: \"e1c9b213-8c36-4ecf-831f-69a912f6364f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851435 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jldnh\" (UniqueName: \"kubernetes.io/projected/5cbee45f-1bdf-44e9-9782-83340ea69870-kube-api-access-jldnh\") pod \"package-server-manager-789f6589d5-8mth6\" (UID: \"5cbee45f-1bdf-44e9-9782-83340ea69870\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8mth6"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851475 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/14db9d97-7da5-43c2-8d48-fb435f1a19d0-proxy-tls\") pod \"machine-config-controller-84d6567774-cfts2\" (UID: \"14db9d97-7da5-43c2-8d48-fb435f1a19d0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cfts2"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851510 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/5cbee45f-1bdf-44e9-9782-83340ea69870-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8mth6\" (UID: \"5cbee45f-1bdf-44e9-9782-83340ea69870\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8mth6"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851528 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1234fab4-2533-4255-bdc2-dd1c3d3d61b5-profile-collector-cert\") pod \"catalog-operator-68c6474976-7vd5x\" (UID: \"1234fab4-2533-4255-bdc2-dd1c3d3d61b5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851536 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62-config\") pod \"kube-apiserver-operator-766d6c64bb-sp4hz\" (UID: \"fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sp4hz"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851555 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851590 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7822bd5e-93d1-4f1e-961c-ec0c8a04ab59-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6rw4v\" (UID: \"7822bd5e-93d1-4f1e-961c-ec0c8a04ab59\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6rw4v"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.851611 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1c9b213-8c36-4ecf-831f-69a912f6364f-serving-cert\") pod \"openshift-config-operator-7777fb866f-fdkct\" (UID: \"e1c9b213-8c36-4ecf-831f-69a912f6364f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.852141 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.852678 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bde9c2fa-aa41-445a-bfb3-eecde86f5ce5-config-volume\") pod \"dns-default-k8qh8\" (UID: \"bde9c2fa-aa41-445a-bfb3-eecde86f5ce5\") " pod="openshift-dns/dns-default-k8qh8"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.852713 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsq8c\" (UniqueName: \"kubernetes.io/projected/e1c9b213-8c36-4ecf-831f-69a912f6364f-kube-api-access-hsq8c\") pod \"openshift-config-operator-7777fb866f-fdkct\" (UID: \"e1c9b213-8c36-4ecf-831f-69a912f6364f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.852758 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d51c244-aac1-41de-adc4-2393a45392f1-config\") pod \"kube-controller-manager-operator-78b949d7b-kmmt9\" (UID: \"9d51c244-aac1-41de-adc4-2393a45392f1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmmt9"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.852777 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/91e2c481-01ee-461f-bc5b-d09b7ea221c5-serviceca\") pod \"image-pruner-29535840-t9tlz\" (UID: \"91e2c481-01ee-461f-bc5b-d09b7ea221c5\") " pod="openshift-image-registry/image-pruner-29535840-t9tlz"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.852814 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/16339491-baee-42b5-82bb-07bca82a5f77-bound-sa-token\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.852828 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6846d54c-4d22-46c7-b017-947a3986d773-stats-auth\") pod \"router-default-5444994796-8lcg4\" (UID: \"6846d54c-4d22-46c7-b017-947a3986d773\") " pod="openshift-ingress/router-default-5444994796-8lcg4"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.852846 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-sp4hz\" (UID: \"fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sp4hz"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.852915 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-audit-policies\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk"
Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.852940 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16339491-baee-42b5-82bb-07bca82a5f77-trusted-ca\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.852959 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/14db9d97-7da5-43c2-8d48-fb435f1a19d0-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-cfts2\" (UID: \"14db9d97-7da5-43c2-8d48-fb435f1a19d0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cfts2" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.852976 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9gg9\" (UniqueName: \"kubernetes.io/projected/396c6e41-89e8-4ecf-ac96-f73aad1f4bbb-kube-api-access-l9gg9\") pod \"migrator-59844c95c7-2xvkz\" (UID: \"396c6e41-89e8-4ecf-ac96-f73aad1f4bbb\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2xvkz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.852992 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853009 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95hgc\" (UniqueName: \"kubernetes.io/projected/678f27fc-d210-4a4f-bd73-090378740da9-kube-api-access-95hgc\") pod \"collect-profiles-29535840-tfxxm\" (UID: \"678f27fc-d210-4a4f-bd73-090378740da9\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853037 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf32f77b-92ad-479d-8ee3-423f16089eb6-config\") pod \"authentication-operator-69f744f599-9z8qr\" (UID: \"bf32f77b-92ad-479d-8ee3-423f16089eb6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853063 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6846d54c-4d22-46c7-b017-947a3986d773-metrics-certs\") pod \"router-default-5444994796-8lcg4\" (UID: \"6846d54c-4d22-46c7-b017-947a3986d773\") " pod="openshift-ingress/router-default-5444994796-8lcg4" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853093 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6846d54c-4d22-46c7-b017-947a3986d773-service-ca-bundle\") pod \"router-default-5444994796-8lcg4\" (UID: \"6846d54c-4d22-46c7-b017-947a3986d773\") " pod="openshift-ingress/router-default-5444994796-8lcg4" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853111 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bde9c2fa-aa41-445a-bfb3-eecde86f5ce5-metrics-tls\") pod \"dns-default-k8qh8\" (UID: \"bde9c2fa-aa41-445a-bfb3-eecde86f5ce5\") " pod="openshift-dns/dns-default-k8qh8" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853154 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b5292e2-0434-46c6-ba9e-33622d4d5cbf-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-72rjz\" (UID: 
\"5b5292e2-0434-46c6-ba9e-33622d4d5cbf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-72rjz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853173 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/38d33d08-97ce-49cb-b200-8ee30fc09e77-node-bootstrap-token\") pod \"machine-config-server-sl77b\" (UID: \"38d33d08-97ce-49cb-b200-8ee30fc09e77\") " pod="openshift-machine-config-operator/machine-config-server-sl77b" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853211 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d5b604c3-aa52-42f3-8922-8edee056f016-metrics-tls\") pod \"dns-operator-744455d44c-d9gmh\" (UID: \"d5b604c3-aa52-42f3-8922-8edee056f016\") " pod="openshift-dns-operator/dns-operator-744455d44c-d9gmh" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853248 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pqfm\" (UniqueName: \"kubernetes.io/projected/bf32f77b-92ad-479d-8ee3-423f16089eb6-kube-api-access-9pqfm\") pod \"authentication-operator-69f744f599-9z8qr\" (UID: \"bf32f77b-92ad-479d-8ee3-423f16089eb6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853272 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/26e75b38-be64-4f34-933f-731abfe217b6-csi-data-dir\") pod \"csi-hostpathplugin-wdgtd\" (UID: \"26e75b38-be64-4f34-933f-731abfe217b6\") " pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853298 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" 
(UniqueName: \"kubernetes.io/secret/16339491-baee-42b5-82bb-07bca82a5f77-installation-pull-secrets\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853470 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv9hp\" (UniqueName: \"kubernetes.io/projected/df035290-8e3c-422b-90ac-573b592defcf-kube-api-access-mv9hp\") pod \"auto-csr-approver-29535848-ccctv\" (UID: \"df035290-8e3c-422b-90ac-573b592defcf\") " pod="openshift-infra/auto-csr-approver-29535848-ccctv" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853518 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59cxk\" (UniqueName: \"kubernetes.io/projected/bd798400-ea88-4aad-ae19-815b6b8d57da-kube-api-access-59cxk\") pod \"ingress-canary-574r8\" (UID: \"bd798400-ea88-4aad-ae19-815b6b8d57da\") " pod="openshift-ingress-canary/ingress-canary-574r8" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853575 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853594 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d51c244-aac1-41de-adc4-2393a45392f1-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-kmmt9\" (UID: \"9d51c244-aac1-41de-adc4-2393a45392f1\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmmt9" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853611 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsp7c\" (UniqueName: \"kubernetes.io/projected/26e75b38-be64-4f34-933f-731abfe217b6-kube-api-access-zsp7c\") pod \"csi-hostpathplugin-wdgtd\" (UID: \"26e75b38-be64-4f34-933f-731abfe217b6\") " pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853644 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd798400-ea88-4aad-ae19-815b6b8d57da-cert\") pod \"ingress-canary-574r8\" (UID: \"bd798400-ea88-4aad-ae19-815b6b8d57da\") " pod="openshift-ingress-canary/ingress-canary-574r8" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853681 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/16339491-baee-42b5-82bb-07bca82a5f77-ca-trust-extracted\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853720 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853811 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwd7v\" (UniqueName: 
\"kubernetes.io/projected/16339491-baee-42b5-82bb-07bca82a5f77-kube-api-access-fwd7v\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853832 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853850 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853867 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-856kl\" (UniqueName: \"kubernetes.io/projected/91e2c481-01ee-461f-bc5b-d09b7ea221c5-kube-api-access-856kl\") pod \"image-pruner-29535840-t9tlz\" (UID: \"91e2c481-01ee-461f-bc5b-d09b7ea221c5\") " pod="openshift-image-registry/image-pruner-29535840-t9tlz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853895 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853914 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx57q\" (UniqueName: \"kubernetes.io/projected/1234fab4-2533-4255-bdc2-dd1c3d3d61b5-kube-api-access-hx57q\") pod \"catalog-operator-68c6474976-7vd5x\" (UID: \"1234fab4-2533-4255-bdc2-dd1c3d3d61b5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853934 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b5292e2-0434-46c6-ba9e-33622d4d5cbf-config\") pod \"openshift-apiserver-operator-796bbdcf4f-72rjz\" (UID: \"5b5292e2-0434-46c6-ba9e-33622d4d5cbf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-72rjz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853953 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6846d54c-4d22-46c7-b017-947a3986d773-default-certificate\") pod \"router-default-5444994796-8lcg4\" (UID: \"6846d54c-4d22-46c7-b017-947a3986d773\") " pod="openshift-ingress/router-default-5444994796-8lcg4" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.854000 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4ncb\" (UniqueName: \"kubernetes.io/projected/d5b604c3-aa52-42f3-8922-8edee056f016-kube-api-access-g4ncb\") pod \"dns-operator-744455d44c-d9gmh\" (UID: \"d5b604c3-aa52-42f3-8922-8edee056f016\") " pod="openshift-dns-operator/dns-operator-744455d44c-d9gmh" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.854020 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsc87\" (UniqueName: 
\"kubernetes.io/projected/7822bd5e-93d1-4f1e-961c-ec0c8a04ab59-kube-api-access-gsc87\") pod \"multus-admission-controller-857f4d67dd-6rw4v\" (UID: \"7822bd5e-93d1-4f1e-961c-ec0c8a04ab59\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6rw4v" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.854036 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.854052 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf32f77b-92ad-479d-8ee3-423f16089eb6-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-9z8qr\" (UID: \"bf32f77b-92ad-479d-8ee3-423f16089eb6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.854071 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/16339491-baee-42b5-82bb-07bca82a5f77-registry-certificates\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.854087 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 
27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.854107 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k94m7\" (UniqueName: \"kubernetes.io/projected/6846d54c-4d22-46c7-b017-947a3986d773-kube-api-access-k94m7\") pod \"router-default-5444994796-8lcg4\" (UID: \"6846d54c-4d22-46c7-b017-947a3986d773\") " pod="openshift-ingress/router-default-5444994796-8lcg4" Feb 27 00:09:04 crc kubenswrapper[4781]: E0227 00:09:04.855374 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:05.355358274 +0000 UTC m=+214.612897828 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.857250 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e1c9b213-8c36-4ecf-831f-69a912f6364f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fdkct\" (UID: \"e1c9b213-8c36-4ecf-831f-69a912f6364f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.857798 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/16339491-baee-42b5-82bb-07bca82a5f77-registry-tls\") pod \"image-registry-697d97f7c8-tw95c\" (UID: 
\"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.858842 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf32f77b-92ad-479d-8ee3-423f16089eb6-service-ca-bundle\") pod \"authentication-operator-69f744f599-9z8qr\" (UID: \"bf32f77b-92ad-479d-8ee3-423f16089eb6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.861045 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-audit-dir\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.863411 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-audit-policies\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.864544 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/14db9d97-7da5-43c2-8d48-fb435f1a19d0-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-cfts2\" (UID: \"14db9d97-7da5-43c2-8d48-fb435f1a19d0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cfts2" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.853092 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/678f27fc-d210-4a4f-bd73-090378740da9-config-volume\") pod \"collect-profiles-29535840-tfxxm\" (UID: \"678f27fc-d210-4a4f-bd73-090378740da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.867736 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d51c244-aac1-41de-adc4-2393a45392f1-config\") pod \"kube-controller-manager-operator-78b949d7b-kmmt9\" (UID: \"9d51c244-aac1-41de-adc4-2393a45392f1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmmt9" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.868038 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/91e2c481-01ee-461f-bc5b-d09b7ea221c5-serviceca\") pod \"image-pruner-29535840-t9tlz\" (UID: \"91e2c481-01ee-461f-bc5b-d09b7ea221c5\") " pod="openshift-image-registry/image-pruner-29535840-t9tlz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.868456 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/14db9d97-7da5-43c2-8d48-fb435f1a19d0-proxy-tls\") pod \"machine-config-controller-84d6567774-cfts2\" (UID: \"14db9d97-7da5-43c2-8d48-fb435f1a19d0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cfts2" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.868963 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf32f77b-92ad-479d-8ee3-423f16089eb6-config\") pod \"authentication-operator-69f744f599-9z8qr\" (UID: \"bf32f77b-92ad-479d-8ee3-423f16089eb6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.871222 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.871878 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7822bd5e-93d1-4f1e-961c-ec0c8a04ab59-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6rw4v\" (UID: \"7822bd5e-93d1-4f1e-961c-ec0c8a04ab59\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6rw4v" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.872021 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6846d54c-4d22-46c7-b017-947a3986d773-service-ca-bundle\") pod \"router-default-5444994796-8lcg4\" (UID: \"6846d54c-4d22-46c7-b017-947a3986d773\") " pod="openshift-ingress/router-default-5444994796-8lcg4" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.872189 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6846d54c-4d22-46c7-b017-947a3986d773-stats-auth\") pod \"router-default-5444994796-8lcg4\" (UID: \"6846d54c-4d22-46c7-b017-947a3986d773\") " pod="openshift-ingress/router-default-5444994796-8lcg4" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.872385 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf32f77b-92ad-479d-8ee3-423f16089eb6-serving-cert\") pod \"authentication-operator-69f744f599-9z8qr\" (UID: \"bf32f77b-92ad-479d-8ee3-423f16089eb6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr" Feb 27 00:09:04 crc kubenswrapper[4781]: 
I0227 00:09:04.872401 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16339491-baee-42b5-82bb-07bca82a5f77-trusted-ca\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.875542 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.875936 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/5cbee45f-1bdf-44e9-9782-83340ea69870-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8mth6\" (UID: \"5cbee45f-1bdf-44e9-9782-83340ea69870\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8mth6" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.876538 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/16339491-baee-42b5-82bb-07bca82a5f77-registry-certificates\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.878910 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc"] Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.879407 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b5292e2-0434-46c6-ba9e-33622d4d5cbf-config\") pod \"openshift-apiserver-operator-796bbdcf4f-72rjz\" (UID: \"5b5292e2-0434-46c6-ba9e-33622d4d5cbf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-72rjz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.880536 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf32f77b-92ad-479d-8ee3-423f16089eb6-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-9z8qr\" (UID: \"bf32f77b-92ad-479d-8ee3-423f16089eb6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.882030 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/16339491-baee-42b5-82bb-07bca82a5f77-ca-trust-extracted\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.888676 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1234fab4-2533-4255-bdc2-dd1c3d3d61b5-profile-collector-cert\") pod \"catalog-operator-68c6474976-7vd5x\" (UID: \"1234fab4-2533-4255-bdc2-dd1c3d3d61b5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.888756 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/678f27fc-d210-4a4f-bd73-090378740da9-secret-volume\") pod \"collect-profiles-29535840-tfxxm\" (UID: \"678f27fc-d210-4a4f-bd73-090378740da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm" Feb 27 00:09:04 crc 
kubenswrapper[4781]: I0227 00:09:04.893501 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b5292e2-0434-46c6-ba9e-33622d4d5cbf-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-72rjz\" (UID: \"5b5292e2-0434-46c6-ba9e-33622d4d5cbf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-72rjz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.893539 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.893776 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.893839 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.894167 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1234fab4-2533-4255-bdc2-dd1c3d3d61b5-srv-cert\") pod \"catalog-operator-68c6474976-7vd5x\" (UID: 
\"1234fab4-2533-4255-bdc2-dd1c3d3d61b5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.894177 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1c9b213-8c36-4ecf-831f-69a912f6364f-serving-cert\") pod \"openshift-config-operator-7777fb866f-fdkct\" (UID: \"e1c9b213-8c36-4ecf-831f-69a912f6364f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.894248 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.894356 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d5b604c3-aa52-42f3-8922-8edee056f016-metrics-tls\") pod \"dns-operator-744455d44c-d9gmh\" (UID: \"d5b604c3-aa52-42f3-8922-8edee056f016\") " pod="openshift-dns-operator/dns-operator-744455d44c-d9gmh" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.894533 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.895131 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czjn5\" (UniqueName: 
\"kubernetes.io/projected/5b5292e2-0434-46c6-ba9e-33622d4d5cbf-kube-api-access-czjn5\") pod \"openshift-apiserver-operator-796bbdcf4f-72rjz\" (UID: \"5b5292e2-0434-46c6-ba9e-33622d4d5cbf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-72rjz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.895139 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-sp4hz\" (UID: \"fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sp4hz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.896235 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.896466 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-sp4hz\" (UID: \"fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sp4hz" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.896749 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.896807 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.897263 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d51c244-aac1-41de-adc4-2393a45392f1-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-kmmt9\" (UID: \"9d51c244-aac1-41de-adc4-2393a45392f1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmmt9" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.910035 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfnpb\" (UniqueName: \"kubernetes.io/projected/a75bfacf-8cf7-4560-8b4a-6e876daa4c8c-kube-api-access-tfnpb\") pod \"downloads-7954f5f757-qjwrj\" (UID: \"a75bfacf-8cf7-4560-8b4a-6e876daa4c8c\") " pod="openshift-console/downloads-7954f5f757-qjwrj" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.927837 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/16339491-baee-42b5-82bb-07bca82a5f77-installation-pull-secrets\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.929972 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/6846d54c-4d22-46c7-b017-947a3986d773-metrics-certs\") pod \"router-default-5444994796-8lcg4\" (UID: \"6846d54c-4d22-46c7-b017-947a3986d773\") " pod="openshift-ingress/router-default-5444994796-8lcg4" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.933918 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6846d54c-4d22-46c7-b017-947a3986d773-default-certificate\") pod \"router-default-5444994796-8lcg4\" (UID: \"6846d54c-4d22-46c7-b017-947a3986d773\") " pod="openshift-ingress/router-default-5444994796-8lcg4" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.953238 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-qjwrj" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.955033 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.955142 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bde9c2fa-aa41-445a-bfb3-eecde86f5ce5-metrics-tls\") pod \"dns-default-k8qh8\" (UID: \"bde9c2fa-aa41-445a-bfb3-eecde86f5ce5\") " pod="openshift-dns/dns-default-k8qh8" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.955162 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/38d33d08-97ce-49cb-b200-8ee30fc09e77-node-bootstrap-token\") pod \"machine-config-server-sl77b\" (UID: \"38d33d08-97ce-49cb-b200-8ee30fc09e77\") " pod="openshift-machine-config-operator/machine-config-server-sl77b" Feb 27 
00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.955192 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/26e75b38-be64-4f34-933f-731abfe217b6-csi-data-dir\") pod \"csi-hostpathplugin-wdgtd\" (UID: \"26e75b38-be64-4f34-933f-731abfe217b6\") " pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.955210 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv9hp\" (UniqueName: \"kubernetes.io/projected/df035290-8e3c-422b-90ac-573b592defcf-kube-api-access-mv9hp\") pod \"auto-csr-approver-29535848-ccctv\" (UID: \"df035290-8e3c-422b-90ac-573b592defcf\") " pod="openshift-infra/auto-csr-approver-29535848-ccctv" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.955459 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59cxk\" (UniqueName: \"kubernetes.io/projected/bd798400-ea88-4aad-ae19-815b6b8d57da-kube-api-access-59cxk\") pod \"ingress-canary-574r8\" (UID: \"bd798400-ea88-4aad-ae19-815b6b8d57da\") " pod="openshift-ingress-canary/ingress-canary-574r8" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.955477 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsp7c\" (UniqueName: \"kubernetes.io/projected/26e75b38-be64-4f34-933f-731abfe217b6-kube-api-access-zsp7c\") pod \"csi-hostpathplugin-wdgtd\" (UID: \"26e75b38-be64-4f34-933f-731abfe217b6\") " pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.955492 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd798400-ea88-4aad-ae19-815b6b8d57da-cert\") pod \"ingress-canary-574r8\" (UID: \"bd798400-ea88-4aad-ae19-815b6b8d57da\") " pod="openshift-ingress-canary/ingress-canary-574r8" Feb 27 00:09:04 crc 
kubenswrapper[4781]: I0227 00:09:04.955559 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/26e75b38-be64-4f34-933f-731abfe217b6-plugins-dir\") pod \"csi-hostpathplugin-wdgtd\" (UID: \"26e75b38-be64-4f34-933f-731abfe217b6\") " pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.955576 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/26e75b38-be64-4f34-933f-731abfe217b6-socket-dir\") pod \"csi-hostpathplugin-wdgtd\" (UID: \"26e75b38-be64-4f34-933f-731abfe217b6\") " pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.955591 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16f4a859-d834-408d-9a9c-4d293b47d95a-config\") pod \"service-ca-operator-777779d784-ksvtc\" (UID: \"16f4a859-d834-408d-9a9c-4d293b47d95a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ksvtc" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.955617 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2r6k\" (UniqueName: \"kubernetes.io/projected/16f4a859-d834-408d-9a9c-4d293b47d95a-kube-api-access-f2r6k\") pod \"service-ca-operator-777779d784-ksvtc\" (UID: \"16f4a859-d834-408d-9a9c-4d293b47d95a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ksvtc" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.955740 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16f4a859-d834-408d-9a9c-4d293b47d95a-serving-cert\") pod \"service-ca-operator-777779d784-ksvtc\" (UID: \"16f4a859-d834-408d-9a9c-4d293b47d95a\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-ksvtc" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.955792 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc7gw\" (UniqueName: \"kubernetes.io/projected/bde9c2fa-aa41-445a-bfb3-eecde86f5ce5-kube-api-access-dc7gw\") pod \"dns-default-k8qh8\" (UID: \"bde9c2fa-aa41-445a-bfb3-eecde86f5ce5\") " pod="openshift-dns/dns-default-k8qh8" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.955805 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/26e75b38-be64-4f34-933f-731abfe217b6-csi-data-dir\") pod \"csi-hostpathplugin-wdgtd\" (UID: \"26e75b38-be64-4f34-933f-731abfe217b6\") " pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.955829 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/26e75b38-be64-4f34-933f-731abfe217b6-registration-dir\") pod \"csi-hostpathplugin-wdgtd\" (UID: \"26e75b38-be64-4f34-933f-731abfe217b6\") " pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.955853 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/38d33d08-97ce-49cb-b200-8ee30fc09e77-certs\") pod \"machine-config-server-sl77b\" (UID: \"38d33d08-97ce-49cb-b200-8ee30fc09e77\") " pod="openshift-machine-config-operator/machine-config-server-sl77b" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.955869 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4k8gs\" (UniqueName: \"kubernetes.io/projected/38d33d08-97ce-49cb-b200-8ee30fc09e77-kube-api-access-4k8gs\") pod \"machine-config-server-sl77b\" (UID: \"38d33d08-97ce-49cb-b200-8ee30fc09e77\") " 
pod="openshift-machine-config-operator/machine-config-server-sl77b" Feb 27 00:09:04 crc kubenswrapper[4781]: E0227 00:09:04.955915 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:05.455898276 +0000 UTC m=+214.713437830 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.955961 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/26e75b38-be64-4f34-933f-731abfe217b6-mountpoint-dir\") pod \"csi-hostpathplugin-wdgtd\" (UID: \"26e75b38-be64-4f34-933f-731abfe217b6\") " pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.956010 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bde9c2fa-aa41-445a-bfb3-eecde86f5ce5-config-volume\") pod \"dns-default-k8qh8\" (UID: \"bde9c2fa-aa41-445a-bfb3-eecde86f5ce5\") " pod="openshift-dns/dns-default-k8qh8" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.956595 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bde9c2fa-aa41-445a-bfb3-eecde86f5ce5-config-volume\") pod \"dns-default-k8qh8\" (UID: \"bde9c2fa-aa41-445a-bfb3-eecde86f5ce5\") " 
pod="openshift-dns/dns-default-k8qh8" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.956781 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/26e75b38-be64-4f34-933f-731abfe217b6-mountpoint-dir\") pod \"csi-hostpathplugin-wdgtd\" (UID: \"26e75b38-be64-4f34-933f-731abfe217b6\") " pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.956918 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/26e75b38-be64-4f34-933f-731abfe217b6-plugins-dir\") pod \"csi-hostpathplugin-wdgtd\" (UID: \"26e75b38-be64-4f34-933f-731abfe217b6\") " pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.957064 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/26e75b38-be64-4f34-933f-731abfe217b6-socket-dir\") pod \"csi-hostpathplugin-wdgtd\" (UID: \"26e75b38-be64-4f34-933f-731abfe217b6\") " pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.957419 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/26e75b38-be64-4f34-933f-731abfe217b6-registration-dir\") pod \"csi-hostpathplugin-wdgtd\" (UID: \"26e75b38-be64-4f34-933f-731abfe217b6\") " pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.957995 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bde9c2fa-aa41-445a-bfb3-eecde86f5ce5-metrics-tls\") pod \"dns-default-k8qh8\" (UID: \"bde9c2fa-aa41-445a-bfb3-eecde86f5ce5\") " pod="openshift-dns/dns-default-k8qh8" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.958149 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jldnh\" (UniqueName: \"kubernetes.io/projected/5cbee45f-1bdf-44e9-9782-83340ea69870-kube-api-access-jldnh\") pod \"package-server-manager-789f6589d5-8mth6\" (UID: \"5cbee45f-1bdf-44e9-9782-83340ea69870\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8mth6" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.967544 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd798400-ea88-4aad-ae19-815b6b8d57da-cert\") pod \"ingress-canary-574r8\" (UID: \"bd798400-ea88-4aad-ae19-815b6b8d57da\") " pod="openshift-ingress-canary/ingress-canary-574r8" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.972190 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsq8c\" (UniqueName: \"kubernetes.io/projected/e1c9b213-8c36-4ecf-831f-69a912f6364f-kube-api-access-hsq8c\") pod \"openshift-config-operator-7777fb866f-fdkct\" (UID: \"e1c9b213-8c36-4ecf-831f-69a912f6364f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct" Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.978553 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6w28d"] Feb 27 00:09:04 crc kubenswrapper[4781]: I0227 00:09:04.989019 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9d51c244-aac1-41de-adc4-2393a45392f1-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-kmmt9\" (UID: \"9d51c244-aac1-41de-adc4-2393a45392f1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmmt9" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.008512 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sp4hz" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.009467 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/16339491-baee-42b5-82bb-07bca82a5f77-bound-sa-token\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.026985 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16f4a859-d834-408d-9a9c-4d293b47d95a-config\") pod \"service-ca-operator-777779d784-ksvtc\" (UID: \"16f4a859-d834-408d-9a9c-4d293b47d95a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ksvtc" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.027039 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16f4a859-d834-408d-9a9c-4d293b47d95a-serving-cert\") pod \"service-ca-operator-777779d784-ksvtc\" (UID: \"16f4a859-d834-408d-9a9c-4d293b47d95a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ksvtc" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.027496 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-72rjz" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.031765 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/38d33d08-97ce-49cb-b200-8ee30fc09e77-certs\") pod \"machine-config-server-sl77b\" (UID: \"38d33d08-97ce-49cb-b200-8ee30fc09e77\") " pod="openshift-machine-config-operator/machine-config-server-sl77b" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.031950 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9gg9\" (UniqueName: \"kubernetes.io/projected/396c6e41-89e8-4ecf-ac96-f73aad1f4bbb-kube-api-access-l9gg9\") pod \"migrator-59844c95c7-2xvkz\" (UID: \"396c6e41-89e8-4ecf-ac96-f73aad1f4bbb\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2xvkz" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.035320 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/38d33d08-97ce-49cb-b200-8ee30fc09e77-node-bootstrap-token\") pod \"machine-config-server-sl77b\" (UID: \"38d33d08-97ce-49cb-b200-8ee30fc09e77\") " pod="openshift-machine-config-operator/machine-config-server-sl77b" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.050951 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnhbf\" (UniqueName: \"kubernetes.io/projected/14db9d97-7da5-43c2-8d48-fb435f1a19d0-kube-api-access-vnhbf\") pod \"machine-config-controller-84d6567774-cfts2\" (UID: \"14db9d97-7da5-43c2-8d48-fb435f1a19d0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cfts2" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.058112 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:05 crc kubenswrapper[4781]: E0227 00:09:05.058514 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:05.558500695 +0000 UTC m=+214.816040249 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.087694 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2xvkz" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.090260 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr2kq\" (UniqueName: \"kubernetes.io/projected/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-kube-api-access-kr2kq\") pod \"oauth-openshift-558db77b4-2zhrk\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.112967 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pqfm\" (UniqueName: \"kubernetes.io/projected/bf32f77b-92ad-479d-8ee3-423f16089eb6-kube-api-access-9pqfm\") pod \"authentication-operator-69f744f599-9z8qr\" (UID: \"bf32f77b-92ad-479d-8ee3-423f16089eb6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.114914 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmmt9" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.128507 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwd7v\" (UniqueName: \"kubernetes.io/projected/16339491-baee-42b5-82bb-07bca82a5f77-kube-api-access-fwd7v\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.131270 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cfts2" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.150243 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4ncb\" (UniqueName: \"kubernetes.io/projected/d5b604c3-aa52-42f3-8922-8edee056f016-kube-api-access-g4ncb\") pod \"dns-operator-744455d44c-d9gmh\" (UID: \"d5b604c3-aa52-42f3-8922-8edee056f016\") " pod="openshift-dns-operator/dns-operator-744455d44c-d9gmh" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.158750 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95hgc\" (UniqueName: \"kubernetes.io/projected/678f27fc-d210-4a4f-bd73-090378740da9-kube-api-access-95hgc\") pod \"collect-profiles-29535840-tfxxm\" (UID: \"678f27fc-d210-4a4f-bd73-090378740da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.158981 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:05 crc kubenswrapper[4781]: E0227 00:09:05.159302 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:05.659289193 +0000 UTC m=+214.916828747 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.167281 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.175917 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsc87\" (UniqueName: \"kubernetes.io/projected/7822bd5e-93d1-4f1e-961c-ec0c8a04ab59-kube-api-access-gsc87\") pod \"multus-admission-controller-857f4d67dd-6rw4v\" (UID: \"7822bd5e-93d1-4f1e-961c-ec0c8a04ab59\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6rw4v" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.181742 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8mth6" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.189833 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-856kl\" (UniqueName: \"kubernetes.io/projected/91e2c481-01ee-461f-bc5b-d09b7ea221c5-kube-api-access-856kl\") pod \"image-pruner-29535840-t9tlz\" (UID: \"91e2c481-01ee-461f-bc5b-d09b7ea221c5\") " pod="openshift-image-registry/image-pruner-29535840-t9tlz" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.209438 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87"] Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.210494 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx"] Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.213241 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k94m7\" (UniqueName: \"kubernetes.io/projected/6846d54c-4d22-46c7-b017-947a3986d773-kube-api-access-k94m7\") pod \"router-default-5444994796-8lcg4\" (UID: \"6846d54c-4d22-46c7-b017-947a3986d773\") " pod="openshift-ingress/router-default-5444994796-8lcg4" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.230255 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tnl79"] Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.231231 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx57q\" (UniqueName: \"kubernetes.io/projected/1234fab4-2533-4255-bdc2-dd1c3d3d61b5-kube-api-access-hx57q\") pod \"catalog-operator-68c6474976-7vd5x\" (UID: \"1234fab4-2533-4255-bdc2-dd1c3d3d61b5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x" Feb 27 00:09:05 crc kubenswrapper[4781]: W0227 
00:09:05.248016 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae09caff_6233_41f8_bb7d_a2314363e2fa.slice/crio-326df085f5c1d99acbb7fcfd76f06d7ee211aa1d505e95c60c62d639a1aeac5b WatchSource:0}: Error finding container 326df085f5c1d99acbb7fcfd76f06d7ee211aa1d505e95c60c62d639a1aeac5b: Status 404 returned error can't find the container with id 326df085f5c1d99acbb7fcfd76f06d7ee211aa1d505e95c60c62d639a1aeac5b Feb 27 00:09:05 crc kubenswrapper[4781]: W0227 00:09:05.249185 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98d3eede_8852_4bf5_a905_25974e47445f.slice/crio-0c726a221019f7d1c3295ad3f7aa546ff2c9eec6b98aeb53101e79567375dc69 WatchSource:0}: Error finding container 0c726a221019f7d1c3295ad3f7aa546ff2c9eec6b98aeb53101e79567375dc69: Status 404 returned error can't find the container with id 0c726a221019f7d1c3295ad3f7aa546ff2c9eec6b98aeb53101e79567375dc69 Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.257496 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.259907 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:05 crc kubenswrapper[4781]: E0227 00:09:05.260324 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-27 00:09:05.760312085 +0000 UTC m=+215.017851639 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.271563 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsp7c\" (UniqueName: \"kubernetes.io/projected/26e75b38-be64-4f34-933f-731abfe217b6-kube-api-access-zsp7c\") pod \"csi-hostpathplugin-wdgtd\" (UID: \"26e75b38-be64-4f34-933f-731abfe217b6\") " pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.274389 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-d9gmh" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.282349 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.291746 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5"] Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.298212 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2zw27"] Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.302443 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.315082 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.334114 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4k8gs\" (UniqueName: \"kubernetes.io/projected/38d33d08-97ce-49cb-b200-8ee30fc09e77-kube-api-access-4k8gs\") pod \"machine-config-server-sl77b\" (UID: \"38d33d08-97ce-49cb-b200-8ee30fc09e77\") " pod="openshift-machine-config-operator/machine-config-server-sl77b" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.354170 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2r6k\" (UniqueName: \"kubernetes.io/projected/16f4a859-d834-408d-9a9c-4d293b47d95a-kube-api-access-f2r6k\") pod \"service-ca-operator-777779d784-ksvtc\" (UID: \"16f4a859-d834-408d-9a9c-4d293b47d95a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ksvtc" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.355310 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv9hp\" (UniqueName: \"kubernetes.io/projected/df035290-8e3c-422b-90ac-573b592defcf-kube-api-access-mv9hp\") pod \"auto-csr-approver-29535848-ccctv\" (UID: \"df035290-8e3c-422b-90ac-573b592defcf\") " pod="openshift-infra/auto-csr-approver-29535848-ccctv" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.362935 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-8lcg4" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.363507 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59cxk\" (UniqueName: \"kubernetes.io/projected/bd798400-ea88-4aad-ae19-815b6b8d57da-kube-api-access-59cxk\") pod \"ingress-canary-574r8\" (UID: \"bd798400-ea88-4aad-ae19-815b6b8d57da\") " pod="openshift-ingress-canary/ingress-canary-574r8" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.363938 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:05 crc kubenswrapper[4781]: E0227 00:09:05.364367 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:05.864351447 +0000 UTC m=+215.121891001 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.369729 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc7gw\" (UniqueName: \"kubernetes.io/projected/bde9c2fa-aa41-445a-bfb3-eecde86f5ce5-kube-api-access-dc7gw\") pod \"dns-default-k8qh8\" (UID: \"bde9c2fa-aa41-445a-bfb3-eecde86f5ce5\") " pod="openshift-dns/dns-default-k8qh8" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.371652 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29535840-t9tlz" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.448882 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-6rw4v" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.465799 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:05 crc kubenswrapper[4781]: E0227 00:09:05.466409 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-27 00:09:05.966396094 +0000 UTC m=+215.223935648 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.478940 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-qjwrj"] Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.514726 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.544780 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535848-ccctv" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.548306 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmmt9"] Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.556942 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ksvtc" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.566404 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-2xvkz"] Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.567076 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:05 crc kubenswrapper[4781]: E0227 00:09:05.567862 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:06.067834656 +0000 UTC m=+215.325374210 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.577788 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-574r8" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.586587 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-sl77b" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.596952 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-k8qh8" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.663497 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-72rjz"] Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.669210 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:05 crc kubenswrapper[4781]: E0227 00:09:05.669664 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:06.169652247 +0000 UTC m=+215.427191801 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.745643 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sp4hz"] Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.774480 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:05 crc kubenswrapper[4781]: E0227 00:09:05.774649 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:06.27460828 +0000 UTC m=+215.532147834 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.775072 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:05 crc kubenswrapper[4781]: E0227 00:09:05.775426 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:06.275415269 +0000 UTC m=+215.532954823 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:05 crc kubenswrapper[4781]: W0227 00:09:05.796887 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b5292e2_0434_46c6_ba9e_33622d4d5cbf.slice/crio-c0acf06a20c749cc1ae544f4c829253a7f8c4162d5a1e577ded11330345f90c7 WatchSource:0}: Error finding container c0acf06a20c749cc1ae544f4c829253a7f8c4162d5a1e577ded11330345f90c7: Status 404 returned error can't find the container with id c0acf06a20c749cc1ae544f4c829253a7f8c4162d5a1e577ded11330345f90c7 Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.810554 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" podStartSLOduration=159.810533676 podStartE2EDuration="2m39.810533676s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:05.806382231 +0000 UTC m=+215.063921805" watchObservedRunningTime="2026-02-27 00:09:05.810533676 +0000 UTC m=+215.068073230" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.816978 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-cfts2"] Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.851837 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" 
event={"ID":"a24423db-53f2-4555-81e4-228b3911e144","Type":"ContainerStarted","Data":"1ae6cc805e344ef4517bcbce76398b0c961c1da8486951790b49bb21c9e7a7c3"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.851902 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.857891 4781 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-swgz7 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.857947 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" podUID="a24423db-53f2-4555-81e4-228b3911e144" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.863037 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-72rjz" event={"ID":"5b5292e2-0434-46c6-ba9e-33622d4d5cbf","Type":"ContainerStarted","Data":"c0acf06a20c749cc1ae544f4c829253a7f8c4162d5a1e577ded11330345f90c7"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.867742 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752" event={"ID":"878b625f-d8df-457f-b208-f4bf5807a8d8","Type":"ContainerStarted","Data":"49b4a5b9ea767c3f3ded04253c9c548546297c1f4f105cc5c68d051e2afca9c6"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.867787 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752" event={"ID":"878b625f-d8df-457f-b208-f4bf5807a8d8","Type":"ContainerStarted","Data":"4cdcfb6cdc8e565b7f9e1f0ea51788665f8bd011272aeedcaa6a097c9b7c5026"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.874499 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vtsxv" event={"ID":"76705148-274c-4428-9508-13fe1193646e","Type":"ContainerStarted","Data":"ad0786127f59b73fe3c925c0b6ad95bb559699fabe3ed584f4d314fe68a7865d"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.874579 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vtsxv" event={"ID":"76705148-274c-4428-9508-13fe1193646e","Type":"ContainerStarted","Data":"52e8848cb853a0dc3b72ab7abe99678676a1a3484d971d2212d9dc7e0814de5c"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.875583 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:05 crc kubenswrapper[4781]: E0227 00:09:05.875826 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:06.375803067 +0000 UTC m=+215.633342621 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.875955 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-kxcrw" event={"ID":"44e0d81c-a6e7-4e95-9901-ea32b8476755","Type":"ContainerStarted","Data":"90ebcf3c9c1ac55298b5363da09070f712031dbaf6a4c00e2fd3545333fb5567"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.876213 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:05 crc kubenswrapper[4781]: E0227 00:09:05.876586 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:06.376560614 +0000 UTC m=+215.634100168 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.877619 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87" event={"ID":"ae09caff-6233-41f8-bb7d-a2314363e2fa","Type":"ContainerStarted","Data":"326df085f5c1d99acbb7fcfd76f06d7ee211aa1d505e95c60c62d639a1aeac5b"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.878542 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8d9mv" event={"ID":"010c6a41-8e2d-4391-ac1b-82814dad98a4","Type":"ContainerStarted","Data":"594da34db4ae926dcdf468cf745b0657133ee14da1e7caa9917bf23a62076b90"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.888294 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmmt9" event={"ID":"9d51c244-aac1-41de-adc4-2393a45392f1","Type":"ContainerStarted","Data":"ab2a19c721b76207a01169cb15c2cd97931ca774c3f35e5701adb61f0b59b53f"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.889579 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6w28d" event={"ID":"8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237","Type":"ContainerStarted","Data":"a61f4755bdafb602d65547200450b5fc07abed6b2954007880e48fa224d12563"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.891257 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g2dgp" event={"ID":"13b9671c-f825-49de-913c-42e8d161f7f8","Type":"ContainerStarted","Data":"d2bb1c7740f71774234caf6902fa0129c33e5b3a02856f65a25645b7a7da84bc"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.891292 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g2dgp" event={"ID":"13b9671c-f825-49de-913c-42e8d161f7f8","Type":"ContainerStarted","Data":"d0fbb5bdc6e055cb4679da36d5719843e846e3bcebb13bedf63f016e29a71e4d"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.893451 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qjwrj" event={"ID":"a75bfacf-8cf7-4560-8b4a-6e876daa4c8c","Type":"ContainerStarted","Data":"810d3cbf6a4bee635e4bba3ed460e3d254c4823916f364fc323d6d9564666098"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.895698 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2xvkz" event={"ID":"396c6e41-89e8-4ecf-ac96-f73aad1f4bbb","Type":"ContainerStarted","Data":"ac56aa27a5bf69355adfe6792874ce1ed6ac0d76620dbe06be510d2e7aa2e337"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.897094 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" event={"ID":"d9ce11ed-3022-47e0-8150-8af94af65076","Type":"ContainerStarted","Data":"b9acad3269f03e3ff4c82c2d13789f039370f2693e20256519df2b53eb8f050e"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.898891 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls" event={"ID":"cb172836-9833-43d5-a99b-cc01b3dd6694","Type":"ContainerStarted","Data":"7be9f81687da1d72f53f1f33763c2c529a0559acf3fd1b374e5534b715c61572"} Feb 27 00:09:05 crc 
kubenswrapper[4781]: I0227 00:09:05.900038 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5" event={"ID":"6497cf4e-c461-4db9-88e4-5de2a5f28404","Type":"ContainerStarted","Data":"eba09c3bf14b2017bd020ade926809be8e232218721a795e83f9c8c829c4c279"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.901088 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc" event={"ID":"ac4a870d-8cda-423b-a15b-391830c944f4","Type":"ContainerStarted","Data":"ba656d94347ab7a46b01e399e4427f40464e1f1733075cff10d80842ea3064c5"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.902136 4781 generic.go:334] "Generic (PLEG): container finished" podID="b9dadb6a-e49e-4473-8338-3af567aacb4a" containerID="fba843b98f1d94d65c7a9a00944c4a2c075200df9aef3220e5fad50a748b7e1c" exitCode=0 Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.902282 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" event={"ID":"b9dadb6a-e49e-4473-8338-3af567aacb4a","Type":"ContainerDied","Data":"fba843b98f1d94d65c7a9a00944c4a2c075200df9aef3220e5fad50a748b7e1c"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.903365 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tnl79" event={"ID":"3f3571fd-ce1b-4105-9100-020fd1cd5076","Type":"ContainerStarted","Data":"9e3065a5ebe5119956c8b7ec1c757a3a6b50906c540cfd667e1750e8d94fbea7"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.904359 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhsds" event={"ID":"14579b3e-131e-4e98-b060-a93d2581479c","Type":"ContainerStarted","Data":"c4d7bf6f0d3b9885760bd255588d753ccdced280441e504b58e960ca2ae484bc"} Feb 27 00:09:05 crc 
kubenswrapper[4781]: I0227 00:09:05.905085 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx" event={"ID":"98d3eede-8852-4bf5-a905-25974e47445f","Type":"ContainerStarted","Data":"0c726a221019f7d1c3295ad3f7aa546ff2c9eec6b98aeb53101e79567375dc69"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.905953 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-29z97" event={"ID":"77c54f3f-bdb8-42ff-a466-3bfb1e2d9464","Type":"ContainerStarted","Data":"58dd5894fa816fda9c2863c52d5161febd70739593a31e4d6775f38563b29b0b"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.905975 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-29z97" event={"ID":"77c54f3f-bdb8-42ff-a466-3bfb1e2d9464","Type":"ContainerStarted","Data":"f44d631dec8222980dc3c7da844439b422c7b0292e9e92d61e325d0c27b8fe0f"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.907286 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd" event={"ID":"3ba2e306-8f79-4e15-8529-f3a16a0fa95f","Type":"ContainerStarted","Data":"a05bfa8d2e666094e1a4bda7adf713a5a9fa809cb67207952f2d86a464796379"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.908444 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" event={"ID":"6dc17f1d-c1f4-43b9-9291-7c32c6804d44","Type":"ContainerStarted","Data":"ce3b476a42a9f3da02bf9f50b03dcf8217bc6886e083283c43ecded5c29ff43e"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.908466 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" 
event={"ID":"6dc17f1d-c1f4-43b9-9291-7c32c6804d44","Type":"ContainerStarted","Data":"eb45173a1f629c7ad2883098f5964e4563b43bb7bdca30eb6fc3bc6e2ce93911"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.908976 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.909772 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-2zw27" event={"ID":"55d8ebfe-a683-40f4-a3ef-bbeadb78ced7","Type":"ContainerStarted","Data":"023ce4349c07e500b74b045fb2be36211c6c6c2639fc1ba445ec76055f8ab82c"} Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.910290 4781 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-ktjdc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.910327 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" podUID="c7332c18-9748-49d2-b512-a46c2d1fcb79" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.916790 4781 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-wgpv7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.916843 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" 
podUID="6dc17f1d-c1f4-43b9-9291-7c32c6804d44" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.977042 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:05 crc kubenswrapper[4781]: E0227 00:09:05.977223 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:06.477196298 +0000 UTC m=+215.734735862 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:05 crc kubenswrapper[4781]: I0227 00:09:05.977270 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:05 crc kubenswrapper[4781]: E0227 00:09:05.977609 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:06.477598938 +0000 UTC m=+215.735138512 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.033051 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm"] Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.078325 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:06 crc kubenswrapper[4781]: E0227 00:09:06.079937 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:06.57991275 +0000 UTC m=+215.837452304 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.080517 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:06 crc kubenswrapper[4781]: E0227 00:09:06.084032 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:06.584012524 +0000 UTC m=+215.841552078 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.182475 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:06 crc kubenswrapper[4781]: E0227 00:09:06.183022 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:06.68298742 +0000 UTC m=+215.940526974 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.288215 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:06 crc kubenswrapper[4781]: E0227 00:09:06.288577 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:06.788563587 +0000 UTC m=+216.046103141 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.404696 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:06 crc kubenswrapper[4781]: E0227 00:09:06.404879 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:06.90485621 +0000 UTC m=+216.162395764 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.405279 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:06 crc kubenswrapper[4781]: E0227 00:09:06.405762 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:06.905738371 +0000 UTC m=+216.163277925 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.513600 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:06 crc kubenswrapper[4781]: E0227 00:09:06.515041 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:07.015021823 +0000 UTC m=+216.272561377 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.611233 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fdkct"] Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.616680 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:06 crc kubenswrapper[4781]: E0227 00:09:06.617028 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:07.117015989 +0000 UTC m=+216.374555543 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.666256 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-d9gmh"] Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.711513 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8mth6"] Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.717438 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:06 crc kubenswrapper[4781]: E0227 00:09:06.717802 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:07.217788636 +0000 UTC m=+216.475328190 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.749164 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wdgtd"] Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.819842 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:06 crc kubenswrapper[4781]: E0227 00:09:06.820538 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:07.320521018 +0000 UTC m=+216.578060572 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.909823 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-9z8qr"] Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.924509 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6rw4v"] Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.925259 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:06 crc kubenswrapper[4781]: E0227 00:09:06.925603 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:07.425568063 +0000 UTC m=+216.683107617 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.943283 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-72rjz" event={"ID":"5b5292e2-0434-46c6-ba9e-33622d4d5cbf","Type":"ContainerStarted","Data":"8cd984c740c34badb780e64f849955677d311b7300000ba185610d4ffa3f9a66"} Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.951344 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-d9gmh" event={"ID":"d5b604c3-aa52-42f3-8922-8edee056f016","Type":"ContainerStarted","Data":"fc89f23ae95a0a4526077950384af7148949bb2a8be42beaf602ae7350df2d54"} Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.958882 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc" event={"ID":"ac4a870d-8cda-423b-a15b-391830c944f4","Type":"ContainerStarted","Data":"5143db84c6592055339523f957d8a21c8537d548fe1cf469c26896d35c317321"} Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.959990 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-8lcg4" event={"ID":"6846d54c-4d22-46c7-b017-947a3986d773","Type":"ContainerStarted","Data":"9bbe669b2438e6fb16b2bad0cd53209261ddc8318eb8f31e08a3909a97e29905"} Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.964534 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8d9mv" 
event={"ID":"010c6a41-8e2d-4391-ac1b-82814dad98a4","Type":"ContainerStarted","Data":"4646d8a82d02b503bb83315a976355ab19911d4ff25a7bb4a4d8efbfd2c3e181"} Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.974924 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2zhrk"] Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.975488 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm" event={"ID":"678f27fc-d210-4a4f-bd73-090378740da9","Type":"ContainerStarted","Data":"8e97fd8fcdef99a06975af07b11d983d49d1856c8a620f0853e184ef575d88e1"} Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.976276 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" podStartSLOduration=160.976253438 podStartE2EDuration="2m40.976253438s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:06.97459083 +0000 UTC m=+216.232130384" watchObservedRunningTime="2026-02-27 00:09:06.976253438 +0000 UTC m=+216.233793002" Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.994102 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-2zw27" event={"ID":"55d8ebfe-a683-40f4-a3ef-bbeadb78ced7","Type":"ContainerStarted","Data":"6f94b6cdd9853273e3a3490b567a3e1c2ed77f91dacc937621e1f8ea1b8ce8a9"} Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.995267 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-2zw27" Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.997706 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ksvtc"] 
Feb 27 00:09:06 crc kubenswrapper[4781]: I0227 00:09:06.999180 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct" event={"ID":"e1c9b213-8c36-4ecf-831f-69a912f6364f","Type":"ContainerStarted","Data":"0b9131f6200bec26d0bbf0b742bbd481abca74e62ee0d568341df4edfb18e5df"} Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.009119 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-sl77b" event={"ID":"38d33d08-97ce-49cb-b200-8ee30fc09e77","Type":"ContainerStarted","Data":"7a08987355db71a575fda8eac26f159da3be9b527fb24f30a4e536f887e83e3d"} Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.012333 4781 patch_prober.go:28] interesting pod/console-operator-58897d9998-2zw27 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/readyz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.012398 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-2zw27" podUID="55d8ebfe-a683-40f4-a3ef-bbeadb78ced7" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/readyz\": dial tcp 10.217.0.22:8443: connect: connection refused" Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.020742 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tnl79" event={"ID":"3f3571fd-ce1b-4105-9100-020fd1cd5076","Type":"ContainerStarted","Data":"a3b992a0fb4c2108c27aba88e7ef166a11d537392232af50898e2980eec06e23"} Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.024229 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535848-ccctv"] Feb 27 00:09:07 crc kubenswrapper[4781]: 
I0227 00:09:07.026911 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:07 crc kubenswrapper[4781]: E0227 00:09:07.031739 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:07.531717464 +0000 UTC m=+216.789257198 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.034097 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-f4jxd" podStartSLOduration=161.034072118 podStartE2EDuration="2m41.034072118s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:07.020951376 +0000 UTC m=+216.278490930" watchObservedRunningTime="2026-02-27 00:09:07.034072118 +0000 UTC m=+216.291611672" Feb 27 00:09:07 crc kubenswrapper[4781]: W0227 00:09:07.052829 4781 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf626ec7_00c1_4ea9_9e8a_1e4a2b66431b.slice/crio-8fca182de0e27c23c7179b6016938f296917be6dc1ab0f25ef6496e1df363d8a WatchSource:0}: Error finding container 8fca182de0e27c23c7179b6016938f296917be6dc1ab0f25ef6496e1df363d8a: Status 404 returned error can't find the container with id 8fca182de0e27c23c7179b6016938f296917be6dc1ab0f25ef6496e1df363d8a Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.053823 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8mth6" event={"ID":"5cbee45f-1bdf-44e9-9782-83340ea69870","Type":"ContainerStarted","Data":"3fba6bbce7c2a12bf2c34025966ebefb7bc8e717b07803c5d4e37ed053b6d787"} Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.068944 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vhsds" podStartSLOduration=161.068921609 podStartE2EDuration="2m41.068921609s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:07.05459881 +0000 UTC m=+216.312138364" watchObservedRunningTime="2026-02-27 00:09:07.068921609 +0000 UTC m=+216.326461163" Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.113234 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5" event={"ID":"6497cf4e-c461-4db9-88e4-5de2a5f28404","Type":"ContainerStarted","Data":"664a75d7c7ee02d01dd0a52c2dad68c164ab167d31654e4a25c1a4eb7af6eeb3"} Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.115228 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5" Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 
00:09:07.129267 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:07 crc kubenswrapper[4781]: E0227 00:09:07.130472 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:07.630450554 +0000 UTC m=+216.887990108 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.154944 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sp4hz" event={"ID":"fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62","Type":"ContainerStarted","Data":"83dc0b93862e776c3facc82b45fa63497d5f2c98362fa8f812d593bf8c5410d3"} Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.158516 4781 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-sw7s5 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:5443/healthz\": dial tcp 10.217.0.44:5443: connect: connection refused" start-of-body= Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.158591 4781 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5" podUID="6497cf4e-c461-4db9-88e4-5de2a5f28404" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.44:5443/healthz\": dial tcp 10.217.0.44:5443: connect: connection refused" Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.182755 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" event={"ID":"26e75b38-be64-4f34-933f-731abfe217b6","Type":"ContainerStarted","Data":"b2453b51fad73b4748df174667c74d32b7b0d1789503b3ac2aa68e83deb0364a"} Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.214907 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.215412 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87" event={"ID":"ae09caff-6233-41f8-bb7d-a2314363e2fa","Type":"ContainerStarted","Data":"52375f28f61199965177d7e1dae2db2484713d90c6d03abe75862f9141503421"} Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.216118 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87" Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.219326 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29535840-t9tlz"] Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.221959 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x"] Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.228170 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cfts2" 
event={"ID":"14db9d97-7da5-43c2-8d48-fb435f1a19d0","Type":"ContainerStarted","Data":"11a994d551e39f6d057d4851d8e1bebcd7f8950398cf71e37c1a5c0f55973eeb"} Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.231412 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.231657 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-k8qh8"] Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.232017 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-vtsxv" podStartSLOduration=161.232006679 podStartE2EDuration="2m41.232006679s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:07.175761266 +0000 UTC m=+216.433300830" watchObservedRunningTime="2026-02-27 00:09:07.232006679 +0000 UTC m=+216.489546233" Feb 27 00:09:07 crc kubenswrapper[4781]: E0227 00:09:07.233291 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:07.733275518 +0000 UTC m=+216.990815152 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.234017 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-qjwrj" Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.249957 4781 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-mmx87 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.250322 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87" podUID="ae09caff-6233-41f8-bb7d-a2314363e2fa" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.252179 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-kxcrw" podStartSLOduration=161.252169942 podStartE2EDuration="2m41.252169942s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:07.249227395 +0000 UTC m=+216.506766949" watchObservedRunningTime="2026-02-27 00:09:07.252169942 +0000 UTC m=+216.509709496" Feb 27 00:09:07 crc 
kubenswrapper[4781]: I0227 00:09:07.253397 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-574r8"] Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.253431 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx" event={"ID":"98d3eede-8852-4bf5-a905-25974e47445f","Type":"ContainerStarted","Data":"befb4f7692e3d74436fee0560fd38addd6635f19d53ef755b808ff07a5fca84a"} Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.258192 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6w28d" event={"ID":"8ce2e42f-cc3e-4bd1-a61b-4267a8f9c237","Type":"ContainerStarted","Data":"fdcf36e8f25a660de079c42743efc1da2e0c2077dacdf20ab9116c4c2a9cdaab"} Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.259055 4781 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-swgz7 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.259095 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" podUID="a24423db-53f2-4555-81e4-228b3911e144" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.259321 4781 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-wgpv7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection 
refused" start-of-body= Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.259337 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" podUID="6dc17f1d-c1f4-43b9-9291-7c32c6804d44" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.260183 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-qjwrj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.260217 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qjwrj" podUID="a75bfacf-8cf7-4560-8b4a-6e876daa4c8c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 27 00:09:07 crc kubenswrapper[4781]: W0227 00:09:07.273728 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd798400_ea88_4aad_ae19_815b6b8d57da.slice/crio-ddc28522b63ffd34c2e34573e44510bb7bf3c28d3b9016bc7f9f73b81b7cca21 WatchSource:0}: Error finding container ddc28522b63ffd34c2e34573e44510bb7bf3c28d3b9016bc7f9f73b81b7cca21: Status 404 returned error can't find the container with id ddc28522b63ffd34c2e34573e44510bb7bf3c28d3b9016bc7f9f73b81b7cca21 Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.332611 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:07 crc kubenswrapper[4781]: E0227 00:09:07.332960 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:07.832930759 +0000 UTC m=+217.090470313 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.339462 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:07 crc kubenswrapper[4781]: E0227 00:09:07.340269 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:07.840234647 +0000 UTC m=+217.097774201 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.340288 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g2dgp" podStartSLOduration=161.340261928 podStartE2EDuration="2m41.340261928s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:07.303557554 +0000 UTC m=+216.561097108" watchObservedRunningTime="2026-02-27 00:09:07.340261928 +0000 UTC m=+216.597801482" Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.397006 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rw9ls" podStartSLOduration=162.396987012 podStartE2EDuration="2m42.396987012s" podCreationTimestamp="2026-02-27 00:06:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:07.331923536 +0000 UTC m=+216.589463100" watchObservedRunningTime="2026-02-27 00:09:07.396987012 +0000 UTC m=+216.654526566" Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.398562 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" podStartSLOduration=161.398555488 podStartE2EDuration="2m41.398555488s" 
podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:07.39557328 +0000 UTC m=+216.653112834" watchObservedRunningTime="2026-02-27 00:09:07.398555488 +0000 UTC m=+216.656095042" Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.449088 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:07 crc kubenswrapper[4781]: E0227 00:09:07.449393 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:07.949378607 +0000 UTC m=+217.206918161 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.484149 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lz752" podStartSLOduration=161.482767755 podStartE2EDuration="2m41.482767755s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:07.47907289 +0000 UTC m=+216.736612444" watchObservedRunningTime="2026-02-27 00:09:07.482767755 +0000 UTC m=+216.740307319" Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.538583 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-2zw27" podStartSLOduration=161.538567527 podStartE2EDuration="2m41.538567527s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:07.538282831 +0000 UTC m=+216.795822385" watchObservedRunningTime="2026-02-27 00:09:07.538567527 +0000 UTC m=+216.796107081" Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.540319 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-qjwrj" podStartSLOduration=161.540311928 podStartE2EDuration="2m41.540311928s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:07.502814776 +0000 UTC m=+216.760354350" watchObservedRunningTime="2026-02-27 00:09:07.540311928 +0000 UTC m=+216.797851482" Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.552276 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:07 crc kubenswrapper[4781]: E0227 00:09:07.552578 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:08.052567029 +0000 UTC m=+217.310106583 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.622030 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6w28d" podStartSLOduration=161.622014196 podStartE2EDuration="2m41.622014196s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:07.583825358 +0000 UTC m=+216.841364912" watchObservedRunningTime="2026-02-27 00:09:07.622014196 +0000 UTC m=+216.879553750" Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.659290 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:07 crc kubenswrapper[4781]: E0227 00:09:07.660507 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:08.160492631 +0000 UTC m=+217.418032185 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.661466 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87" podStartSLOduration=161.661445293 podStartE2EDuration="2m41.661445293s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:07.660282326 +0000 UTC m=+216.917821890" watchObservedRunningTime="2026-02-27 00:09:07.661445293 +0000 UTC m=+216.918984847" Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.665578 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx" podStartSLOduration=161.665562427 podStartE2EDuration="2m41.665562427s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:07.621214818 +0000 UTC m=+216.878754372" watchObservedRunningTime="2026-02-27 00:09:07.665562427 +0000 UTC m=+216.923101981" Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.666577 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" 
(UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:07 crc kubenswrapper[4781]: E0227 00:09:07.667423 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:08.16739254 +0000 UTC m=+217.424932094 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.697393 4781 ???:1] "http: TLS handshake error from 192.168.126.11:36582: no serving certificate available for the kubelet" Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.702141 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8d9mv" podStartSLOduration=161.702123958 podStartE2EDuration="2m41.702123958s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:07.701024053 +0000 UTC m=+216.958563627" watchObservedRunningTime="2026-02-27 00:09:07.702123958 +0000 UTC m=+216.959663512" Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.736267 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5" podStartSLOduration=161.736251533 podStartE2EDuration="2m41.736251533s" 
podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:07.732438245 +0000 UTC m=+216.989977799" watchObservedRunningTime="2026-02-27 00:09:07.736251533 +0000 UTC m=+216.993791087" Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.768099 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:07 crc kubenswrapper[4781]: E0227 00:09:07.768467 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:08.268450543 +0000 UTC m=+217.525990087 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.804499 4781 ???:1] "http: TLS handshake error from 192.168.126.11:36590: no serving certificate available for the kubelet" Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.870304 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:07 crc kubenswrapper[4781]: E0227 00:09:07.870730 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:08.370710644 +0000 UTC m=+217.628250198 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.899843 4781 ???:1] "http: TLS handshake error from 192.168.126.11:36600: no serving certificate available for the kubelet" Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.971514 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:07 crc kubenswrapper[4781]: E0227 00:09:07.971900 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:08.471789959 +0000 UTC m=+217.729329503 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.972063 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:07 crc kubenswrapper[4781]: E0227 00:09:07.974950 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:08.474927181 +0000 UTC m=+217.732466735 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:07 crc kubenswrapper[4781]: I0227 00:09:07.987734 4781 ???:1] "http: TLS handshake error from 192.168.126.11:36610: no serving certificate available for the kubelet" Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.079758 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:08 crc kubenswrapper[4781]: E0227 00:09:08.080097 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:08.580081198 +0000 UTC m=+217.837620752 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.084004 4781 ???:1] "http: TLS handshake error from 192.168.126.11:36624: no serving certificate available for the kubelet" Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.182410 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:08 crc kubenswrapper[4781]: E0227 00:09:08.184114 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:08.6840993 +0000 UTC m=+217.941638854 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.260331 4781 ???:1] "http: TLS handshake error from 192.168.126.11:36634: no serving certificate available for the kubelet" Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.272948 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-8lcg4" event={"ID":"6846d54c-4d22-46c7-b017-947a3986d773","Type":"ContainerStarted","Data":"3ec609bc241485334906f05e040e09ce986640ade2b04f4e43a37b8fd22b2fc8"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.286809 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr" event={"ID":"bf32f77b-92ad-479d-8ee3-423f16089eb6","Type":"ContainerStarted","Data":"5224c25a7e621a9e9012e4e11e4ee9613e1854fc7844ad721ae3070b7213daec"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.286853 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr" event={"ID":"bf32f77b-92ad-479d-8ee3-423f16089eb6","Type":"ContainerStarted","Data":"8ceccf018021f28a76b1767c0a451cf32f6d8967a8d2e2f452ef8818c044517d"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.287314 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:08 crc kubenswrapper[4781]: E0227 00:09:08.287605 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:08.78759109 +0000 UTC m=+218.045130644 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.288796 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6rw4v" event={"ID":"7822bd5e-93d1-4f1e-961c-ec0c8a04ab59","Type":"ContainerStarted","Data":"8fcf8e9b27758ab1e005f19315672289f24b12042f2722b628ae3a3a01a876a4"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.307874 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-d9gmh" event={"ID":"d5b604c3-aa52-42f3-8922-8edee056f016","Type":"ContainerStarted","Data":"32c4427ac5091334a11909736774f7bd80659ddd13e8cd48c20b8b1c1e4cbe4d"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.309304 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-8lcg4" podStartSLOduration=162.309292559 podStartE2EDuration="2m42.309292559s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:08.307537008 +0000 UTC m=+217.565076562" watchObservedRunningTime="2026-02-27 00:09:08.309292559 +0000 UTC m=+217.566832113" Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.324609 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" event={"ID":"d9ce11ed-3022-47e0-8150-8af94af65076","Type":"ContainerStarted","Data":"59fc59679a172212b6e6db1643066e055ceeb6fccb9ef1c0f52b61b5a2e331b4"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.333109 4781 generic.go:334] "Generic (PLEG): container finished" podID="e1c9b213-8c36-4ecf-831f-69a912f6364f" containerID="ad4696dc33a53d001e1c01038e280aa12b2d5bb8553786ed2d43c27cd13a084d" exitCode=0 Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.333371 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct" event={"ID":"e1c9b213-8c36-4ecf-831f-69a912f6364f","Type":"ContainerDied","Data":"ad4696dc33a53d001e1c01038e280aa12b2d5bb8553786ed2d43c27cd13a084d"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.347304 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-9z8qr" podStartSLOduration=163.347286592 podStartE2EDuration="2m43.347286592s" podCreationTimestamp="2026-02-27 00:06:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:08.343793672 +0000 UTC m=+217.601333226" watchObservedRunningTime="2026-02-27 00:09:08.347286592 +0000 UTC m=+217.604826146" Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.360883 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-574r8" 
event={"ID":"bd798400-ea88-4aad-ae19-815b6b8d57da","Type":"ContainerStarted","Data":"21771b43c514592a65ad5552e1e929b9c5b7df3a909f999160490d3099d2a8da"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.360937 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-574r8" event={"ID":"bd798400-ea88-4aad-ae19-815b6b8d57da","Type":"ContainerStarted","Data":"ddc28522b63ffd34c2e34573e44510bb7bf3c28d3b9016bc7f9f73b81b7cca21"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.363829 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-8lcg4" Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.366061 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8lcg4 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.366124 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8lcg4" podUID="6846d54c-4d22-46c7-b017-947a3986d773" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.379393 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sp4hz" event={"ID":"fcbc0ec1-a7de-4cf4-833b-b586f3d6ec62","Type":"ContainerStarted","Data":"9929fa21f47069f64f78d6c0d0314ca1c08d1dad35bf6dc9830c9601924c0444"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.392207 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:08 crc kubenswrapper[4781]: E0227 00:09:08.392653 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:08.892619995 +0000 UTC m=+218.150159549 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.393754 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ksvtc" event={"ID":"16f4a859-d834-408d-9a9c-4d293b47d95a","Type":"ContainerStarted","Data":"5ab3a56e8c7b0d7634469a7ff6fbbb6939fb6353224b40e1746b82a0d6807700"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.393815 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ksvtc" event={"ID":"16f4a859-d834-408d-9a9c-4d293b47d95a","Type":"ContainerStarted","Data":"823a400d2793709daba559c6a9b76da7307e5273d5a5e1dc43fd258e3e48c0b9"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.432853 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc" 
event={"ID":"ac4a870d-8cda-423b-a15b-391830c944f4","Type":"ContainerStarted","Data":"06fc308c716d99e0b9739a3e92c8410f2d714a0cf2a58882f1bef1fe37557956"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.456381 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" podStartSLOduration=163.45636534 podStartE2EDuration="2m43.45636534s" podCreationTimestamp="2026-02-27 00:06:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:08.408124911 +0000 UTC m=+217.665664475" watchObservedRunningTime="2026-02-27 00:09:08.45636534 +0000 UTC m=+217.713904894" Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.469229 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x" event={"ID":"1234fab4-2533-4255-bdc2-dd1c3d3d61b5","Type":"ContainerStarted","Data":"02fcc16946420d5ff5300df02cb41acee13fa00816c45b9fd03ae7735368202e"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.469275 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x" event={"ID":"1234fab4-2533-4255-bdc2-dd1c3d3d61b5","Type":"ContainerStarted","Data":"19ec0c61709e31510f36979c4154d8a604c364f2f1e4a73a25e64466646bfcde"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.469493 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x" Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.481881 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29535840-t9tlz" event={"ID":"91e2c481-01ee-461f-bc5b-d09b7ea221c5","Type":"ContainerStarted","Data":"34034ef1e924a05fbc92daf60e2f0c105f332a30b0fe9cea72b0da3d3065e13e"} Feb 27 00:09:08 crc 
kubenswrapper[4781]: I0227 00:09:08.481926 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29535840-t9tlz" event={"ID":"91e2c481-01ee-461f-bc5b-d09b7ea221c5","Type":"ContainerStarted","Data":"02350f41c01977124604e142f885201d5743582263439e32be7f03871d0f9773"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.497811 4781 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-7vd5x container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body= Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.498320 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x" podUID="1234fab4-2533-4255-bdc2-dd1c3d3d61b5" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.498173 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:08 crc kubenswrapper[4781]: E0227 00:09:08.498234 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:08.998218492 +0000 UTC m=+218.255758046 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.498905 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:08 crc kubenswrapper[4781]: E0227 00:09:08.501892 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:09.001877237 +0000 UTC m=+218.259416781 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.505022 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535848-ccctv" event={"ID":"df035290-8e3c-422b-90ac-573b592defcf","Type":"ContainerStarted","Data":"73bd0b78edcc81c67b914cc89cfaf8646b9814d5783ad5e9856330864dac671a"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.521003 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-sl77b" event={"ID":"38d33d08-97ce-49cb-b200-8ee30fc09e77","Type":"ContainerStarted","Data":"42f8def94e6d3c12464c6b3feeae43ac20bea1ebb7ffb0eeeb0d126d2bc13cb0"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.523453 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" event={"ID":"b9dadb6a-e49e-4473-8338-3af567aacb4a","Type":"ContainerStarted","Data":"ac40adffdd127da39ae2507907fb38373b6d69fa918c3ddcfaee6d0e52368b01"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.581513 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8mth6" event={"ID":"5cbee45f-1bdf-44e9-9782-83340ea69870","Type":"ContainerStarted","Data":"acde24607c7b0fb059c97edcf650a34ce01281254111d33a16097e2c6937e171"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.581561 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8mth6" event={"ID":"5cbee45f-1bdf-44e9-9782-83340ea69870","Type":"ContainerStarted","Data":"b975438d9bf0effc0f383ed3409cf91c5425e96642be5f7f94a4b6b6f475b948"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.582130 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8mth6" Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.606082 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm" event={"ID":"678f27fc-d210-4a4f-bd73-090378740da9","Type":"ContainerStarted","Data":"898ccef1da25e7c00fcd11040419fe4b505ada16cb26d62d9a4806872cb68348"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.608039 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:08 crc kubenswrapper[4781]: E0227 00:09:08.609256 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:09.109242885 +0000 UTC m=+218.366782439 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.616985 4781 ???:1] "http: TLS handshake error from 192.168.126.11:56562: no serving certificate available for the kubelet" Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.635297 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tnl79" event={"ID":"3f3571fd-ce1b-4105-9100-020fd1cd5076","Type":"ContainerStarted","Data":"a6cfbebe704fb1381bfafda6527c42b651059ad2455446c3086177fe8be79344"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.637079 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" event={"ID":"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b","Type":"ContainerStarted","Data":"126cdc4a0b2277218672d3ebb95a09f31d56ce658295a2f26c3118300b77f809"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.637101 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" event={"ID":"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b","Type":"ContainerStarted","Data":"8fca182de0e27c23c7179b6016938f296917be6dc1ab0f25ef6496e1df363d8a"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.637541 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.675218 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/machine-api-operator-5694c8668f-29z97" event={"ID":"77c54f3f-bdb8-42ff-a466-3bfb1e2d9464","Type":"ContainerStarted","Data":"63cb1f3b771058d589224d8fb22198abea8226ee472d22ccf972deff4e404cc9"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.675586 4781 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-2zhrk container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.20:6443/healthz\": dial tcp 10.217.0.20:6443: connect: connection refused" start-of-body= Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.675662 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" podUID="cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.20:6443/healthz\": dial tcp 10.217.0.20:6443: connect: connection refused" Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.676765 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-574r8" podStartSLOduration=7.6767532769999995 podStartE2EDuration="7.676753277s" podCreationTimestamp="2026-02-27 00:09:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:08.676098402 +0000 UTC m=+217.933637956" watchObservedRunningTime="2026-02-27 00:09:08.676753277 +0000 UTC m=+217.934292821" Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.678764 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ksvtc" podStartSLOduration=162.678756854 podStartE2EDuration="2m42.678756854s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-27 00:09:08.582336717 +0000 UTC m=+217.839876271" watchObservedRunningTime="2026-02-27 00:09:08.678756854 +0000 UTC m=+217.936296398" Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.703154 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2xvkz" event={"ID":"396c6e41-89e8-4ecf-ac96-f73aad1f4bbb","Type":"ContainerStarted","Data":"ccd51fda8823b76ef5845848f8d0690dfeeabdb21391d0115db34185a41b1b97"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.703211 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2xvkz" event={"ID":"396c6e41-89e8-4ecf-ac96-f73aad1f4bbb","Type":"ContainerStarted","Data":"3a16f1a6ffa2d016c4c94897233d60791a1997b713a1593909bcc29419e196c4"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.706102 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qjwrj" event={"ID":"a75bfacf-8cf7-4560-8b4a-6e876daa4c8c","Type":"ContainerStarted","Data":"a7c8f7063bafc919361346a9dc1315b92859e40920e9b450653ad64294dfaf96"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.710443 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.711175 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-qjwrj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 
00:09:08.711220 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qjwrj" podUID="a75bfacf-8cf7-4560-8b4a-6e876daa4c8c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 27 00:09:08 crc kubenswrapper[4781]: E0227 00:09:08.711543 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:09.211527847 +0000 UTC m=+218.469067491 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.746747 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rhhqx" event={"ID":"98d3eede-8852-4bf5-a905-25974e47445f","Type":"ContainerStarted","Data":"c3c25fbc57e5d781df69b1687c08ccf96eb500c411327096c0537e00a427f258"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.753779 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmmt9" event={"ID":"9d51c244-aac1-41de-adc4-2393a45392f1","Type":"ContainerStarted","Data":"3695a4c6299a6ec731f058159a25a778df05ce7dbe7bc7d51aa249ce1349c630"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.767819 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/dns-default-k8qh8" event={"ID":"bde9c2fa-aa41-445a-bfb3-eecde86f5ce5","Type":"ContainerStarted","Data":"c68ec309647256b6cd7991fa50c2092e7afd1452b00f47baaff4d838af7fd462"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.768050 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-k8qh8" event={"ID":"bde9c2fa-aa41-445a-bfb3-eecde86f5ce5","Type":"ContainerStarted","Data":"c0e1c18fc197bd478ed251faf34738158576ad1a668568d4158f01011e81ffdb"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.803280 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cfts2" event={"ID":"14db9d97-7da5-43c2-8d48-fb435f1a19d0","Type":"ContainerStarted","Data":"07cbadf40a6f8fc1edfbfb6dfbab48131215d5e9beb738042ced3279ca75fad1"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.803722 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cfts2" event={"ID":"14db9d97-7da5-43c2-8d48-fb435f1a19d0","Type":"ContainerStarted","Data":"e2e4c66c8d45404b36d849e8a515fd4de6c19674d5072fa80ab2cee64f2a7f7f"} Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.803727 4781 patch_prober.go:28] interesting pod/console-operator-58897d9998-2zw27 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/readyz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.803784 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-2zw27" podUID="55d8ebfe-a683-40f4-a3ef-bbeadb78ced7" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/readyz\": dial tcp 10.217.0.22:8443: connect: connection refused" Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 
00:09:08.813125 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:08 crc kubenswrapper[4781]: E0227 00:09:08.814445 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:09.314424443 +0000 UTC m=+218.571963997 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.855939 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sp4hz" podStartSLOduration=162.855906617 podStartE2EDuration="2m42.855906617s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:08.852822366 +0000 UTC m=+218.110361930" watchObservedRunningTime="2026-02-27 00:09:08.855906617 +0000 UTC m=+218.113446161" Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.856927 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cjjxc" podStartSLOduration=162.8569225 podStartE2EDuration="2m42.8569225s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:08.758704652 +0000 UTC m=+218.016244206" watchObservedRunningTime="2026-02-27 00:09:08.8569225 +0000 UTC m=+218.114462054" Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.884339 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmx87" Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.915189 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:08 crc kubenswrapper[4781]: E0227 00:09:08.916220 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:09.416208073 +0000 UTC m=+218.673747627 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:08 crc kubenswrapper[4781]: I0227 00:09:08.928736 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29535840-t9tlz" podStartSLOduration=162.928718161 podStartE2EDuration="2m42.928718161s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:08.927842611 +0000 UTC m=+218.185382165" watchObservedRunningTime="2026-02-27 00:09:08.928718161 +0000 UTC m=+218.186257715" Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.018027 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:09 crc kubenswrapper[4781]: E0227 00:09:09.018512 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:09.518495885 +0000 UTC m=+218.776035439 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.060559 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmmt9" podStartSLOduration=163.060540082 podStartE2EDuration="2m43.060540082s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:09.009050468 +0000 UTC m=+218.266590022" watchObservedRunningTime="2026-02-27 00:09:09.060540082 +0000 UTC m=+218.318079636" Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.076247 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.076516 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.090737 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.091306 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.097992 4781 ???:1] "http: TLS handshake error from 192.168.126.11:56568: no serving certificate 
available for the kubelet" Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.126186 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x" podStartSLOduration=163.126171801 podStartE2EDuration="2m43.126171801s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:09.125324801 +0000 UTC m=+218.382864355" watchObservedRunningTime="2026-02-27 00:09:09.126171801 +0000 UTC m=+218.383711355" Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.127342 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:09 crc kubenswrapper[4781]: E0227 00:09:09.127826 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:09.627810338 +0000 UTC m=+218.885349892 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.128443 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cfts2" podStartSLOduration=163.128431393 podStartE2EDuration="2m43.128431393s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:09.062495167 +0000 UTC m=+218.320034721" watchObservedRunningTime="2026-02-27 00:09:09.128431393 +0000 UTC m=+218.385970977" Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.196704 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2xvkz" podStartSLOduration=163.196687652 podStartE2EDuration="2m43.196687652s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:09.19443161 +0000 UTC m=+218.451971164" watchObservedRunningTime="2026-02-27 00:09:09.196687652 +0000 UTC m=+218.454227206" Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.207842 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sw7s5" Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.228456 4781 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:09 crc kubenswrapper[4781]: E0227 00:09:09.228885 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:09.728870272 +0000 UTC m=+218.986409826 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.305367 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-29z97" podStartSLOduration=163.30533323 podStartE2EDuration="2m43.30533323s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:09.245133356 +0000 UTC m=+218.502672900" watchObservedRunningTime="2026-02-27 00:09:09.30533323 +0000 UTC m=+218.562872794" Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.330412 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:09 crc kubenswrapper[4781]: E0227 00:09:09.330771 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:09.830759315 +0000 UTC m=+219.088298869 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.367375 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-72rjz" podStartSLOduration=163.367357356 podStartE2EDuration="2m43.367357356s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:09.318868101 +0000 UTC m=+218.576407645" watchObservedRunningTime="2026-02-27 00:09:09.367357356 +0000 UTC m=+218.624896910" Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.418956 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8lcg4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 
00:09:09 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld Feb 27 00:09:09 crc kubenswrapper[4781]: [+]process-running ok Feb 27 00:09:09 crc kubenswrapper[4781]: healthz check failed Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.419043 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8lcg4" podUID="6846d54c-4d22-46c7-b017-947a3986d773" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.431819 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:09 crc kubenswrapper[4781]: E0227 00:09:09.432231 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:09.932214598 +0000 UTC m=+219.189754152 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.439643 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-sl77b" podStartSLOduration=7.439612868 podStartE2EDuration="7.439612868s" podCreationTimestamp="2026-02-27 00:09:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:09.375464163 +0000 UTC m=+218.633003717" watchObservedRunningTime="2026-02-27 00:09:09.439612868 +0000 UTC m=+218.697152422" Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.440069 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" podStartSLOduration=163.440064898 podStartE2EDuration="2m43.440064898s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:09.437356516 +0000 UTC m=+218.694896070" watchObservedRunningTime="2026-02-27 00:09:09.440064898 +0000 UTC m=+218.697604462" Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.522398 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tnl79" podStartSLOduration=163.522380511 podStartE2EDuration="2m43.522380511s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:09.521991862 +0000 UTC m=+218.779531416" watchObservedRunningTime="2026-02-27 00:09:09.522380511 +0000 UTC m=+218.779920065" Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.533645 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:09 crc kubenswrapper[4781]: E0227 00:09:09.534521 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:10.03450869 +0000 UTC m=+219.292048244 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.554040 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm" podStartSLOduration=163.554009148 podStartE2EDuration="2m43.554009148s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:09.551859349 +0000 UTC m=+218.809398903" watchObservedRunningTime="2026-02-27 00:09:09.554009148 +0000 UTC m=+218.811548692" Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.623424 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8mth6" podStartSLOduration=163.623386713 podStartE2EDuration="2m43.623386713s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:09.615086072 +0000 UTC m=+218.872625636" watchObservedRunningTime="2026-02-27 00:09:09.623386713 +0000 UTC m=+218.880926267" Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.634668 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:09 crc kubenswrapper[4781]: E0227 00:09:09.634944 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:10.134899478 +0000 UTC m=+219.392439032 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.635352 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:09 crc kubenswrapper[4781]: E0227 00:09:09.635717 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:10.135700376 +0000 UTC m=+219.393239930 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.711480 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" podStartSLOduration=163.711464268 podStartE2EDuration="2m43.711464268s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:09.710642569 +0000 UTC m=+218.968182123" watchObservedRunningTime="2026-02-27 00:09:09.711464268 +0000 UTC m=+218.969003822" Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.737000 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:09 crc kubenswrapper[4781]: E0227 00:09:09.737320 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:10.237305402 +0000 UTC m=+219.494844956 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.834374 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct" event={"ID":"e1c9b213-8c36-4ecf-831f-69a912f6364f","Type":"ContainerStarted","Data":"af4073fdeecca7a7aa604ec924d63a357759f8c158c1a298f15e9d389ac98486"} Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.834478 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct" Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.839449 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:09 crc kubenswrapper[4781]: E0227 00:09:09.839763 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:10.339750087 +0000 UTC m=+219.597289631 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.846808 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6rw4v" event={"ID":"7822bd5e-93d1-4f1e-961c-ec0c8a04ab59","Type":"ContainerStarted","Data":"d792e54137f4bdb958e476f08a383626f9798d849f05c35836509bfd2561a429"} Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.846852 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6rw4v" event={"ID":"7822bd5e-93d1-4f1e-961c-ec0c8a04ab59","Type":"ContainerStarted","Data":"8d62d374d868463d06a572a2831fb88e13d49e7e6c27c1eea169b4b6fd868e01"} Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.848844 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-d9gmh" event={"ID":"d5b604c3-aa52-42f3-8922-8edee056f016","Type":"ContainerStarted","Data":"226ab1af46a96d5e2d3021413e6b3e5e048d3912a0c3b00819349e01a3736a07"} Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.858597 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-k8qh8" event={"ID":"bde9c2fa-aa41-445a-bfb3-eecde86f5ce5","Type":"ContainerStarted","Data":"eca78ef0786186f4f67e7ce7e0815c48a2dae136d8020f874ef92b453160d9c0"} Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.859269 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-k8qh8" Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 
00:09:09.869981 4781 ???:1] "http: TLS handshake error from 192.168.126.11:56574: no serving certificate available for the kubelet" Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.873915 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" event={"ID":"26e75b38-be64-4f34-933f-731abfe217b6","Type":"ContainerStarted","Data":"6b0aab158a544e1afdf453eadcba2b38defe3314e343d756ffb848145161f206"} Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.884066 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-qjwrj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.884116 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qjwrj" podUID="a75bfacf-8cf7-4560-8b4a-6e876daa4c8c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.921863 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7vd5x" Feb 27 00:09:09 crc kubenswrapper[4781]: I0227 00:09:09.940521 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:09 crc kubenswrapper[4781]: E0227 00:09:09.942120 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:10.44210465 +0000 UTC m=+219.699644204 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.042804 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:10 crc kubenswrapper[4781]: E0227 00:09:10.047796 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:10.54778352 +0000 UTC m=+219.805323074 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.135815 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct" podStartSLOduration=164.135784324 podStartE2EDuration="2m44.135784324s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:10.032498689 +0000 UTC m=+219.290038243" watchObservedRunningTime="2026-02-27 00:09:10.135784324 +0000 UTC m=+219.393323888" Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.144191 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:10 crc kubenswrapper[4781]: E0227 00:09:10.144408 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:10.644388551 +0000 UTC m=+219.901928105 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.144533 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:10 crc kubenswrapper[4781]: E0227 00:09:10.144864 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:10.644856562 +0000 UTC m=+219.902396116 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.238733 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-6rw4v" podStartSLOduration=164.23871337 podStartE2EDuration="2m44.23871337s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:10.190120823 +0000 UTC m=+219.447660387" watchObservedRunningTime="2026-02-27 00:09:10.23871337 +0000 UTC m=+219.496252924" Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.249203 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:10 crc kubenswrapper[4781]: E0227 00:09:10.249525 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:10.749509988 +0000 UTC m=+220.007049542 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.318731 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-k8qh8" podStartSLOduration=8.31870956 podStartE2EDuration="8.31870956s" podCreationTimestamp="2026-02-27 00:09:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:10.24130413 +0000 UTC m=+219.498843684" watchObservedRunningTime="2026-02-27 00:09:10.31870956 +0000 UTC m=+219.576249124" Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.350324 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:10 crc kubenswrapper[4781]: E0227 00:09:10.350696 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:10.850684485 +0000 UTC m=+220.108224039 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.366615 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8lcg4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 00:09:10 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld Feb 27 00:09:10 crc kubenswrapper[4781]: [+]process-running ok Feb 27 00:09:10 crc kubenswrapper[4781]: healthz check failed Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.366680 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8lcg4" podUID="6846d54c-4d22-46c7-b017-947a3986d773" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.451949 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:10 crc kubenswrapper[4781]: E0227 00:09:10.452375 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-27 00:09:10.952358712 +0000 UTC m=+220.209898266 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.553345 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:10 crc kubenswrapper[4781]: E0227 00:09:10.553695 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:11.053679702 +0000 UTC m=+220.311219256 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.566605 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-d9gmh" podStartSLOduration=164.566585229 podStartE2EDuration="2m44.566585229s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:10.322387464 +0000 UTC m=+219.579927028" watchObservedRunningTime="2026-02-27 00:09:10.566585229 +0000 UTC m=+219.824124783" Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.569324 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ktjdc"] Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.569516 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" podUID="c7332c18-9748-49d2-b512-a46c2d1fcb79" containerName="controller-manager" containerID="cri-o://7953132f6160b5cf17723ae05d8c6903d6203982009ce6fad05bc88ee99ff710" gracePeriod=30 Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.601945 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.655052 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:10 crc kubenswrapper[4781]: E0227 00:09:10.655586 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:11.155563965 +0000 UTC m=+220.413103519 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.661283 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.663081 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-2zw27" Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.665866 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7"] Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.666072 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" podUID="a24423db-53f2-4555-81e4-228b3911e144" 
containerName="route-controller-manager" containerID="cri-o://1ae6cc805e344ef4517bcbce76398b0c961c1da8486951790b49bb21c9e7a7c3" gracePeriod=30 Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.681238 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.758387 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:10 crc kubenswrapper[4781]: E0227 00:09:10.758733 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:11.258720546 +0000 UTC m=+220.516260090 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.864590 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:10 crc kubenswrapper[4781]: E0227 00:09:10.865163 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:11.365140303 +0000 UTC m=+220.622679847 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.874048 4781 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-2zhrk container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.20:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.874117 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" podUID="cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.20:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.912106 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ht6qt" Feb 27 00:09:10 crc kubenswrapper[4781]: I0227 00:09:10.967687 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:10 crc kubenswrapper[4781]: E0227 00:09:10.971323 
4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:11.471305294 +0000 UTC m=+220.728844848 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.071847 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:11 crc kubenswrapper[4781]: E0227 00:09:11.071964 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:11.571948448 +0000 UTC m=+220.829488002 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.072361 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:11 crc kubenswrapper[4781]: E0227 00:09:11.072659 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:11.572651624 +0000 UTC m=+220.830191178 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.089820 4781 patch_prober.go:28] interesting pod/apiserver-76f77b778f-cr2bb container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 27 00:09:11 crc kubenswrapper[4781]: [+]log ok Feb 27 00:09:11 crc kubenswrapper[4781]: [+]etcd ok Feb 27 00:09:11 crc kubenswrapper[4781]: [-]poststarthook/start-apiserver-admission-initializer failed: reason withheld Feb 27 00:09:11 crc kubenswrapper[4781]: [+]poststarthook/generic-apiserver-start-informers ok Feb 27 00:09:11 crc kubenswrapper[4781]: [-]poststarthook/max-in-flight-filter failed: reason withheld Feb 27 00:09:11 crc kubenswrapper[4781]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 27 00:09:11 crc kubenswrapper[4781]: [+]poststarthook/image.openshift.io-apiserver-caches ok Feb 27 00:09:11 crc kubenswrapper[4781]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Feb 27 00:09:11 crc kubenswrapper[4781]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Feb 27 00:09:11 crc kubenswrapper[4781]: [+]poststarthook/project.openshift.io-projectcache ok Feb 27 00:09:11 crc kubenswrapper[4781]: [-]poststarthook/project.openshift.io-projectauthorizationcache failed: reason withheld Feb 27 00:09:11 crc kubenswrapper[4781]: [-]poststarthook/openshift.io-startinformers failed: reason withheld Feb 27 00:09:11 crc 
kubenswrapper[4781]: [-]poststarthook/openshift.io-restmapperupdater failed: reason withheld Feb 27 00:09:11 crc kubenswrapper[4781]: [-]poststarthook/quota.openshift.io-clusterquotamapping failed: reason withheld Feb 27 00:09:11 crc kubenswrapper[4781]: livez check failed Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.089896 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" podUID="d9ce11ed-3022-47e0-8150-8af94af65076" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.173601 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:11 crc kubenswrapper[4781]: E0227 00:09:11.173999 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:11.673984464 +0000 UTC m=+220.931524018 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.191188 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.210514 4781 ???:1] "http: TLS handshake error from 192.168.126.11:56584: no serving certificate available for the kubelet" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.275162 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:11 crc kubenswrapper[4781]: E0227 00:09:11.275716 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:11.775698513 +0000 UTC m=+221.033238067 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.367842 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8lcg4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 00:09:11 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld Feb 27 00:09:11 crc kubenswrapper[4781]: [+]process-running ok Feb 27 00:09:11 crc kubenswrapper[4781]: healthz check failed Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.367898 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8lcg4" podUID="6846d54c-4d22-46c7-b017-947a3986d773" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.376363 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:11 crc kubenswrapper[4781]: E0227 00:09:11.376591 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-27 00:09:11.876546742 +0000 UTC m=+221.134086296 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.376789 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:11 crc kubenswrapper[4781]: E0227 00:09:11.377084 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:11.877070524 +0000 UTC m=+221.134610078 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.475905 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-42hbx"] Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.478137 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:11 crc kubenswrapper[4781]: E0227 00:09:11.478435 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:11.978417174 +0000 UTC m=+221.235956728 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.479987 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-42hbx" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.480029 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-42hbx"] Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.493310 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.559679 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.582490 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19ed5401-2778-4266-8bf1-1c7244dac100-utilities\") pod \"community-operators-42hbx\" (UID: \"19ed5401-2778-4266-8bf1-1c7244dac100\") " pod="openshift-marketplace/community-operators-42hbx" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.582559 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgndh\" (UniqueName: \"kubernetes.io/projected/19ed5401-2778-4266-8bf1-1c7244dac100-kube-api-access-xgndh\") pod \"community-operators-42hbx\" (UID: \"19ed5401-2778-4266-8bf1-1c7244dac100\") " pod="openshift-marketplace/community-operators-42hbx" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.582589 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19ed5401-2778-4266-8bf1-1c7244dac100-catalog-content\") pod \"community-operators-42hbx\" (UID: \"19ed5401-2778-4266-8bf1-1c7244dac100\") " pod="openshift-marketplace/community-operators-42hbx" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.582612 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:11 crc kubenswrapper[4781]: E0227 00:09:11.582896 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:12.082883716 +0000 UTC m=+221.340423260 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.663180 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.663202 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kztqg"] Feb 27 00:09:11 crc kubenswrapper[4781]: E0227 00:09:11.663659 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7332c18-9748-49d2-b512-a46c2d1fcb79" containerName="controller-manager" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.663680 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7332c18-9748-49d2-b512-a46c2d1fcb79" containerName="controller-manager" Feb 27 00:09:11 crc kubenswrapper[4781]: E0227 00:09:11.663699 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a24423db-53f2-4555-81e4-228b3911e144" containerName="route-controller-manager" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.663706 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a24423db-53f2-4555-81e4-228b3911e144" containerName="route-controller-manager" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.663811 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7332c18-9748-49d2-b512-a46c2d1fcb79" containerName="controller-manager" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.663833 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a24423db-53f2-4555-81e4-228b3911e144" containerName="route-controller-manager" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.664525 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kztqg" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.670113 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.673122 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kztqg"] Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.683499 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7332c18-9748-49d2-b512-a46c2d1fcb79-client-ca\") pod \"c7332c18-9748-49d2-b512-a46c2d1fcb79\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") " Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.683555 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7332c18-9748-49d2-b512-a46c2d1fcb79-serving-cert\") pod \"c7332c18-9748-49d2-b512-a46c2d1fcb79\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") " Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.683605 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c7332c18-9748-49d2-b512-a46c2d1fcb79-proxy-ca-bundles\") pod \"c7332c18-9748-49d2-b512-a46c2d1fcb79\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") " Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.683736 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.683800 4781 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7332c18-9748-49d2-b512-a46c2d1fcb79-config\") pod \"c7332c18-9748-49d2-b512-a46c2d1fcb79\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") " Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.683823 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnbwb\" (UniqueName: \"kubernetes.io/projected/c7332c18-9748-49d2-b512-a46c2d1fcb79-kube-api-access-tnbwb\") pod \"c7332c18-9748-49d2-b512-a46c2d1fcb79\" (UID: \"c7332c18-9748-49d2-b512-a46c2d1fcb79\") " Feb 27 00:09:11 crc kubenswrapper[4781]: E0227 00:09:11.683957 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:12.183932609 +0000 UTC m=+221.441472163 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.684168 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19ed5401-2778-4266-8bf1-1c7244dac100-utilities\") pod \"community-operators-42hbx\" (UID: \"19ed5401-2778-4266-8bf1-1c7244dac100\") " pod="openshift-marketplace/community-operators-42hbx" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.684240 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgndh\" (UniqueName: \"kubernetes.io/projected/19ed5401-2778-4266-8bf1-1c7244dac100-kube-api-access-xgndh\") pod \"community-operators-42hbx\" (UID: \"19ed5401-2778-4266-8bf1-1c7244dac100\") " pod="openshift-marketplace/community-operators-42hbx" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.684274 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19ed5401-2778-4266-8bf1-1c7244dac100-catalog-content\") pod \"community-operators-42hbx\" (UID: \"19ed5401-2778-4266-8bf1-1c7244dac100\") " pod="openshift-marketplace/community-operators-42hbx" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.684296 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: 
\"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:11 crc kubenswrapper[4781]: E0227 00:09:11.684585 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:12.184578204 +0000 UTC m=+221.442117758 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.685285 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7332c18-9748-49d2-b512-a46c2d1fcb79-config" (OuterVolumeSpecName: "config") pod "c7332c18-9748-49d2-b512-a46c2d1fcb79" (UID: "c7332c18-9748-49d2-b512-a46c2d1fcb79"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.685480 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19ed5401-2778-4266-8bf1-1c7244dac100-utilities\") pod \"community-operators-42hbx\" (UID: \"19ed5401-2778-4266-8bf1-1c7244dac100\") " pod="openshift-marketplace/community-operators-42hbx" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.685829 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7332c18-9748-49d2-b512-a46c2d1fcb79-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c7332c18-9748-49d2-b512-a46c2d1fcb79" (UID: "c7332c18-9748-49d2-b512-a46c2d1fcb79"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.686240 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19ed5401-2778-4266-8bf1-1c7244dac100-catalog-content\") pod \"community-operators-42hbx\" (UID: \"19ed5401-2778-4266-8bf1-1c7244dac100\") " pod="openshift-marketplace/community-operators-42hbx" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.686296 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7332c18-9748-49d2-b512-a46c2d1fcb79-client-ca" (OuterVolumeSpecName: "client-ca") pod "c7332c18-9748-49d2-b512-a46c2d1fcb79" (UID: "c7332c18-9748-49d2-b512-a46c2d1fcb79"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.693897 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7332c18-9748-49d2-b512-a46c2d1fcb79-kube-api-access-tnbwb" (OuterVolumeSpecName: "kube-api-access-tnbwb") pod "c7332c18-9748-49d2-b512-a46c2d1fcb79" (UID: "c7332c18-9748-49d2-b512-a46c2d1fcb79"). InnerVolumeSpecName "kube-api-access-tnbwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.694027 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7332c18-9748-49d2-b512-a46c2d1fcb79-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c7332c18-9748-49d2-b512-a46c2d1fcb79" (UID: "c7332c18-9748-49d2-b512-a46c2d1fcb79"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.712418 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgndh\" (UniqueName: \"kubernetes.io/projected/19ed5401-2778-4266-8bf1-1c7244dac100-kube-api-access-xgndh\") pod \"community-operators-42hbx\" (UID: \"19ed5401-2778-4266-8bf1-1c7244dac100\") " pod="openshift-marketplace/community-operators-42hbx" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.762216 4781 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.785582 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a24423db-53f2-4555-81e4-228b3911e144-serving-cert\") pod \"a24423db-53f2-4555-81e4-228b3911e144\" (UID: \"a24423db-53f2-4555-81e4-228b3911e144\") " Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.785677 
4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrg6p\" (UniqueName: \"kubernetes.io/projected/a24423db-53f2-4555-81e4-228b3911e144-kube-api-access-xrg6p\") pod \"a24423db-53f2-4555-81e4-228b3911e144\" (UID: \"a24423db-53f2-4555-81e4-228b3911e144\") " Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.785724 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a24423db-53f2-4555-81e4-228b3911e144-config\") pod \"a24423db-53f2-4555-81e4-228b3911e144\" (UID: \"a24423db-53f2-4555-81e4-228b3911e144\") " Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.785747 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a24423db-53f2-4555-81e4-228b3911e144-client-ca\") pod \"a24423db-53f2-4555-81e4-228b3911e144\" (UID: \"a24423db-53f2-4555-81e4-228b3911e144\") " Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.785894 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.786227 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpnxw\" (UniqueName: \"kubernetes.io/projected/2b050e9e-d6c8-4e27-ad3f-9681553c1539-kube-api-access-zpnxw\") pod \"certified-operators-kztqg\" (UID: \"2b050e9e-d6c8-4e27-ad3f-9681553c1539\") " pod="openshift-marketplace/certified-operators-kztqg" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.786293 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/2b050e9e-d6c8-4e27-ad3f-9681553c1539-utilities\") pod \"certified-operators-kztqg\" (UID: \"2b050e9e-d6c8-4e27-ad3f-9681553c1539\") " pod="openshift-marketplace/certified-operators-kztqg" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.786348 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b050e9e-d6c8-4e27-ad3f-9681553c1539-catalog-content\") pod \"certified-operators-kztqg\" (UID: \"2b050e9e-d6c8-4e27-ad3f-9681553c1539\") " pod="openshift-marketplace/certified-operators-kztqg" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.786409 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7332c18-9748-49d2-b512-a46c2d1fcb79-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.786422 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnbwb\" (UniqueName: \"kubernetes.io/projected/c7332c18-9748-49d2-b512-a46c2d1fcb79-kube-api-access-tnbwb\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.786432 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7332c18-9748-49d2-b512-a46c2d1fcb79-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.786443 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7332c18-9748-49d2-b512-a46c2d1fcb79-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.786454 4781 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c7332c18-9748-49d2-b512-a46c2d1fcb79-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:11 crc 
kubenswrapper[4781]: E0227 00:09:11.786713 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:12.286668661 +0000 UTC m=+221.544208215 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.786801 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a24423db-53f2-4555-81e4-228b3911e144-config" (OuterVolumeSpecName: "config") pod "a24423db-53f2-4555-81e4-228b3911e144" (UID: "a24423db-53f2-4555-81e4-228b3911e144"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.786949 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a24423db-53f2-4555-81e4-228b3911e144-client-ca" (OuterVolumeSpecName: "client-ca") pod "a24423db-53f2-4555-81e4-228b3911e144" (UID: "a24423db-53f2-4555-81e4-228b3911e144"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.790951 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a24423db-53f2-4555-81e4-228b3911e144-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a24423db-53f2-4555-81e4-228b3911e144" (UID: "a24423db-53f2-4555-81e4-228b3911e144"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.811752 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a24423db-53f2-4555-81e4-228b3911e144-kube-api-access-xrg6p" (OuterVolumeSpecName: "kube-api-access-xrg6p") pod "a24423db-53f2-4555-81e4-228b3911e144" (UID: "a24423db-53f2-4555-81e4-228b3911e144"). InnerVolumeSpecName "kube-api-access-xrg6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.846007 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-42hbx" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.864845 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kqrgb"] Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.865757 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kqrgb" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.888553 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kqrgb"] Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.888578 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpnxw\" (UniqueName: \"kubernetes.io/projected/2b050e9e-d6c8-4e27-ad3f-9681553c1539-kube-api-access-zpnxw\") pod \"certified-operators-kztqg\" (UID: \"2b050e9e-d6c8-4e27-ad3f-9681553c1539\") " pod="openshift-marketplace/certified-operators-kztqg" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.889295 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b050e9e-d6c8-4e27-ad3f-9681553c1539-utilities\") pod \"certified-operators-kztqg\" (UID: \"2b050e9e-d6c8-4e27-ad3f-9681553c1539\") " pod="openshift-marketplace/certified-operators-kztqg" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.889390 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b050e9e-d6c8-4e27-ad3f-9681553c1539-catalog-content\") pod \"certified-operators-kztqg\" (UID: \"2b050e9e-d6c8-4e27-ad3f-9681553c1539\") " pod="openshift-marketplace/certified-operators-kztqg" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.890162 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b050e9e-d6c8-4e27-ad3f-9681553c1539-utilities\") pod \"certified-operators-kztqg\" (UID: \"2b050e9e-d6c8-4e27-ad3f-9681553c1539\") " pod="openshift-marketplace/certified-operators-kztqg" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.890671 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2b050e9e-d6c8-4e27-ad3f-9681553c1539-catalog-content\") pod \"certified-operators-kztqg\" (UID: \"2b050e9e-d6c8-4e27-ad3f-9681553c1539\") " pod="openshift-marketplace/certified-operators-kztqg" Feb 27 00:09:11 crc kubenswrapper[4781]: E0227 00:09:11.891969 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:12.391946152 +0000 UTC m=+221.649485706 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.889542 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.892781 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a24423db-53f2-4555-81e4-228b3911e144-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.892797 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a24423db-53f2-4555-81e4-228b3911e144-client-ca\") on node \"crc\" DevicePath \"\"" 
Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.892811 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a24423db-53f2-4555-81e4-228b3911e144-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.892825 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrg6p\" (UniqueName: \"kubernetes.io/projected/a24423db-53f2-4555-81e4-228b3911e144-kube-api-access-xrg6p\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.910452 4781 generic.go:334] "Generic (PLEG): container finished" podID="a24423db-53f2-4555-81e4-228b3911e144" containerID="1ae6cc805e344ef4517bcbce76398b0c961c1da8486951790b49bb21c9e7a7c3" exitCode=0 Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.910515 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" event={"ID":"a24423db-53f2-4555-81e4-228b3911e144","Type":"ContainerDied","Data":"1ae6cc805e344ef4517bcbce76398b0c961c1da8486951790b49bb21c9e7a7c3"} Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.910543 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" event={"ID":"a24423db-53f2-4555-81e4-228b3911e144","Type":"ContainerDied","Data":"e2d52d381f5f2c2aa6e3b3529d449b9cd90d4bab2b2b6374496041f06b7f95d6"} Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.910561 4781 scope.go:117] "RemoveContainer" containerID="1ae6cc805e344ef4517bcbce76398b0c961c1da8486951790b49bb21c9e7a7c3" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.910748 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.916015 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpnxw\" (UniqueName: \"kubernetes.io/projected/2b050e9e-d6c8-4e27-ad3f-9681553c1539-kube-api-access-zpnxw\") pod \"certified-operators-kztqg\" (UID: \"2b050e9e-d6c8-4e27-ad3f-9681553c1539\") " pod="openshift-marketplace/certified-operators-kztqg" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.916983 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" event={"ID":"26e75b38-be64-4f34-933f-731abfe217b6","Type":"ContainerStarted","Data":"e131d45bbf0a76e83c9db42c296a6f6c98df038ffcda0bb488d5e7953b3020f1"} Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.917033 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" event={"ID":"26e75b38-be64-4f34-933f-731abfe217b6","Type":"ContainerStarted","Data":"99370a1ca3aba85234b46ccb3551132e01f233f017156fc02713ad284ef2946a"} Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.918489 4781 generic.go:334] "Generic (PLEG): container finished" podID="c7332c18-9748-49d2-b512-a46c2d1fcb79" containerID="7953132f6160b5cf17723ae05d8c6903d6203982009ce6fad05bc88ee99ff710" exitCode=0 Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.918546 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.918575 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" event={"ID":"c7332c18-9748-49d2-b512-a46c2d1fcb79","Type":"ContainerDied","Data":"7953132f6160b5cf17723ae05d8c6903d6203982009ce6fad05bc88ee99ff710"} Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.918591 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ktjdc" event={"ID":"c7332c18-9748-49d2-b512-a46c2d1fcb79","Type":"ContainerDied","Data":"569469b156a3d6f73fda1c00c629b8cfcf29a4662b4eccaa3dcb213bb4a0f1d1"} Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.942688 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7"] Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.945211 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-swgz7"] Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.947822 4781 scope.go:117] "RemoveContainer" containerID="1ae6cc805e344ef4517bcbce76398b0c961c1da8486951790b49bb21c9e7a7c3" Feb 27 00:09:11 crc kubenswrapper[4781]: E0227 00:09:11.948213 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ae6cc805e344ef4517bcbce76398b0c961c1da8486951790b49bb21c9e7a7c3\": container with ID starting with 1ae6cc805e344ef4517bcbce76398b0c961c1da8486951790b49bb21c9e7a7c3 not found: ID does not exist" containerID="1ae6cc805e344ef4517bcbce76398b0c961c1da8486951790b49bb21c9e7a7c3" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.948245 4781 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1ae6cc805e344ef4517bcbce76398b0c961c1da8486951790b49bb21c9e7a7c3"} err="failed to get container status \"1ae6cc805e344ef4517bcbce76398b0c961c1da8486951790b49bb21c9e7a7c3\": rpc error: code = NotFound desc = could not find container \"1ae6cc805e344ef4517bcbce76398b0c961c1da8486951790b49bb21c9e7a7c3\": container with ID starting with 1ae6cc805e344ef4517bcbce76398b0c961c1da8486951790b49bb21c9e7a7c3 not found: ID does not exist" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.948264 4781 scope.go:117] "RemoveContainer" containerID="7953132f6160b5cf17723ae05d8c6903d6203982009ce6fad05bc88ee99ff710" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.960052 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ktjdc"] Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.963015 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.964760 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ktjdc"] Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.990899 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kztqg" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.994268 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.994536 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac30245d-7e42-440c-99a0-60e2ae15cb8b-catalog-content\") pod \"community-operators-kqrgb\" (UID: \"ac30245d-7e42-440c-99a0-60e2ae15cb8b\") " pod="openshift-marketplace/community-operators-kqrgb" Feb 27 00:09:11 crc kubenswrapper[4781]: E0227 00:09:11.994638 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:12.494566251 +0000 UTC m=+221.752105805 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.995519 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac30245d-7e42-440c-99a0-60e2ae15cb8b-utilities\") pod \"community-operators-kqrgb\" (UID: \"ac30245d-7e42-440c-99a0-60e2ae15cb8b\") " pod="openshift-marketplace/community-operators-kqrgb" Feb 27 00:09:11 crc kubenswrapper[4781]: I0227 00:09:11.995893 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tk7f\" (UniqueName: \"kubernetes.io/projected/ac30245d-7e42-440c-99a0-60e2ae15cb8b-kube-api-access-8tk7f\") pod \"community-operators-kqrgb\" (UID: \"ac30245d-7e42-440c-99a0-60e2ae15cb8b\") " pod="openshift-marketplace/community-operators-kqrgb" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.006839 4781 scope.go:117] "RemoveContainer" containerID="7953132f6160b5cf17723ae05d8c6903d6203982009ce6fad05bc88ee99ff710" Feb 27 00:09:12 crc kubenswrapper[4781]: E0227 00:09:12.007518 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7953132f6160b5cf17723ae05d8c6903d6203982009ce6fad05bc88ee99ff710\": container with ID starting with 7953132f6160b5cf17723ae05d8c6903d6203982009ce6fad05bc88ee99ff710 not found: ID does not exist" containerID="7953132f6160b5cf17723ae05d8c6903d6203982009ce6fad05bc88ee99ff710" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.007557 4781 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7953132f6160b5cf17723ae05d8c6903d6203982009ce6fad05bc88ee99ff710"} err="failed to get container status \"7953132f6160b5cf17723ae05d8c6903d6203982009ce6fad05bc88ee99ff710\": rpc error: code = NotFound desc = could not find container \"7953132f6160b5cf17723ae05d8c6903d6203982009ce6fad05bc88ee99ff710\": container with ID starting with 7953132f6160b5cf17723ae05d8c6903d6203982009ce6fad05bc88ee99ff710 not found: ID does not exist" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.049927 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-52xgq"] Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.050945 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-52xgq" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.089748 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-52xgq"] Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.106847 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac30245d-7e42-440c-99a0-60e2ae15cb8b-catalog-content\") pod \"community-operators-kqrgb\" (UID: \"ac30245d-7e42-440c-99a0-60e2ae15cb8b\") " pod="openshift-marketplace/community-operators-kqrgb" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.107010 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.107107 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac30245d-7e42-440c-99a0-60e2ae15cb8b-utilities\") pod \"community-operators-kqrgb\" (UID: \"ac30245d-7e42-440c-99a0-60e2ae15cb8b\") " pod="openshift-marketplace/community-operators-kqrgb" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.107206 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tk7f\" (UniqueName: \"kubernetes.io/projected/ac30245d-7e42-440c-99a0-60e2ae15cb8b-kube-api-access-8tk7f\") pod \"community-operators-kqrgb\" (UID: \"ac30245d-7e42-440c-99a0-60e2ae15cb8b\") " pod="openshift-marketplace/community-operators-kqrgb" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.108010 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac30245d-7e42-440c-99a0-60e2ae15cb8b-catalog-content\") pod \"community-operators-kqrgb\" (UID: \"ac30245d-7e42-440c-99a0-60e2ae15cb8b\") " pod="openshift-marketplace/community-operators-kqrgb" Feb 27 00:09:12 crc kubenswrapper[4781]: E0227 00:09:12.108614 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:12.608601873 +0000 UTC m=+221.866141427 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.112592 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac30245d-7e42-440c-99a0-60e2ae15cb8b-utilities\") pod \"community-operators-kqrgb\" (UID: \"ac30245d-7e42-440c-99a0-60e2ae15cb8b\") " pod="openshift-marketplace/community-operators-kqrgb" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.137851 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tk7f\" (UniqueName: \"kubernetes.io/projected/ac30245d-7e42-440c-99a0-60e2ae15cb8b-kube-api-access-8tk7f\") pod \"community-operators-kqrgb\" (UID: \"ac30245d-7e42-440c-99a0-60e2ae15cb8b\") " pod="openshift-marketplace/community-operators-kqrgb" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.192093 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kqrgb" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.209822 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.210015 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8558\" (UniqueName: \"kubernetes.io/projected/0f286d62-2145-4bbb-91eb-28ffda9b2494-kube-api-access-f8558\") pod \"certified-operators-52xgq\" (UID: \"0f286d62-2145-4bbb-91eb-28ffda9b2494\") " pod="openshift-marketplace/certified-operators-52xgq" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.210061 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f286d62-2145-4bbb-91eb-28ffda9b2494-catalog-content\") pod \"certified-operators-52xgq\" (UID: \"0f286d62-2145-4bbb-91eb-28ffda9b2494\") " pod="openshift-marketplace/certified-operators-52xgq" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.210109 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f286d62-2145-4bbb-91eb-28ffda9b2494-utilities\") pod \"certified-operators-52xgq\" (UID: \"0f286d62-2145-4bbb-91eb-28ffda9b2494\") " pod="openshift-marketplace/certified-operators-52xgq" Feb 27 00:09:12 crc kubenswrapper[4781]: E0227 00:09:12.210208 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2026-02-27 00:09:12.710192669 +0000 UTC m=+221.967732223 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.310964 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f286d62-2145-4bbb-91eb-28ffda9b2494-catalog-content\") pod \"certified-operators-52xgq\" (UID: \"0f286d62-2145-4bbb-91eb-28ffda9b2494\") " pod="openshift-marketplace/certified-operators-52xgq" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.311792 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f286d62-2145-4bbb-91eb-28ffda9b2494-utilities\") pod \"certified-operators-52xgq\" (UID: \"0f286d62-2145-4bbb-91eb-28ffda9b2494\") " pod="openshift-marketplace/certified-operators-52xgq" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.311883 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8558\" (UniqueName: \"kubernetes.io/projected/0f286d62-2145-4bbb-91eb-28ffda9b2494-kube-api-access-f8558\") pod \"certified-operators-52xgq\" (UID: \"0f286d62-2145-4bbb-91eb-28ffda9b2494\") " pod="openshift-marketplace/certified-operators-52xgq" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.311911 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/0f286d62-2145-4bbb-91eb-28ffda9b2494-catalog-content\") pod \"certified-operators-52xgq\" (UID: \"0f286d62-2145-4bbb-91eb-28ffda9b2494\") " pod="openshift-marketplace/certified-operators-52xgq" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.311927 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:12 crc kubenswrapper[4781]: E0227 00:09:12.312227 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:12.812212395 +0000 UTC m=+222.069751949 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.320515 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f286d62-2145-4bbb-91eb-28ffda9b2494-utilities\") pod \"certified-operators-52xgq\" (UID: \"0f286d62-2145-4bbb-91eb-28ffda9b2494\") " pod="openshift-marketplace/certified-operators-52xgq" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.331574 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-42hbx"] Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.343775 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8558\" (UniqueName: \"kubernetes.io/projected/0f286d62-2145-4bbb-91eb-28ffda9b2494-kube-api-access-f8558\") pod \"certified-operators-52xgq\" (UID: \"0f286d62-2145-4bbb-91eb-28ffda9b2494\") " pod="openshift-marketplace/certified-operators-52xgq" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.360231 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kztqg"] Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.372281 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8lcg4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 00:09:12 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld Feb 27 00:09:12 crc 
kubenswrapper[4781]: [+]process-running ok Feb 27 00:09:12 crc kubenswrapper[4781]: healthz check failed Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.372323 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8lcg4" podUID="6846d54c-4d22-46c7-b017-947a3986d773" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 00:09:12 crc kubenswrapper[4781]: W0227 00:09:12.388229 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b050e9e_d6c8_4e27_ad3f_9681553c1539.slice/crio-f682c737bcb211243a2988ca17e566ea00c7e2d14bf78fba6f612945a62f66e6 WatchSource:0}: Error finding container f682c737bcb211243a2988ca17e566ea00c7e2d14bf78fba6f612945a62f66e6: Status 404 returned error can't find the container with id f682c737bcb211243a2988ca17e566ea00c7e2d14bf78fba6f612945a62f66e6 Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.412686 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:12 crc kubenswrapper[4781]: E0227 00:09:12.413127 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 00:09:12.913110745 +0000 UTC m=+222.170650289 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.441199 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-52xgq" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.463711 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-774845979b-t9755"] Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.464673 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.469989 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf"] Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.470775 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.472413 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.472784 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.472903 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.473222 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.473504 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.473685 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.474866 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.474919 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kqrgb"] Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.475358 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.475491 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 27 00:09:12 crc kubenswrapper[4781]: 
I0227 00:09:12.475777 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.475882 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.476749 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.480712 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-774845979b-t9755"] Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.482159 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.483770 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf"] Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.514265 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:12 crc kubenswrapper[4781]: E0227 00:09:12.514670 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 00:09:13.01465467 +0000 UTC m=+222.272194224 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tw95c" (UID: "16339491-baee-42b5-82bb-07bca82a5f77") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.530611 4781 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-27T00:09:11.76226175Z","Handler":null,"Name":""} Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.533950 4781 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.533979 4781 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.625088 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.625355 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3667d98-cf94-4751-8191-1d924ea13617-config\") pod \"route-controller-manager-8d8c487b-4kknf\" (UID: \"c3667d98-cf94-4751-8191-1d924ea13617\") " 
pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.625393 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-config\") pod \"controller-manager-774845979b-t9755\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") " pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.625414 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjsgw\" (UniqueName: \"kubernetes.io/projected/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-kube-api-access-xjsgw\") pod \"controller-manager-774845979b-t9755\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") " pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.625442 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3667d98-cf94-4751-8191-1d924ea13617-serving-cert\") pod \"route-controller-manager-8d8c487b-4kknf\" (UID: \"c3667d98-cf94-4751-8191-1d924ea13617\") " pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.625471 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3667d98-cf94-4751-8191-1d924ea13617-client-ca\") pod \"route-controller-manager-8d8c487b-4kknf\" (UID: \"c3667d98-cf94-4751-8191-1d924ea13617\") " pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.625496 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-proxy-ca-bundles\") pod \"controller-manager-774845979b-t9755\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") " pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.625521 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-client-ca\") pod \"controller-manager-774845979b-t9755\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") " pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.625536 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qk2n\" (UniqueName: \"kubernetes.io/projected/c3667d98-cf94-4751-8191-1d924ea13617-kube-api-access-4qk2n\") pod \"route-controller-manager-8d8c487b-4kknf\" (UID: \"c3667d98-cf94-4751-8191-1d924ea13617\") " pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.625572 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-serving-cert\") pod \"controller-manager-774845979b-t9755\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") " pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.636292 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: 
"8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.729223 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-proxy-ca-bundles\") pod \"controller-manager-774845979b-t9755\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") " pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.729284 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-client-ca\") pod \"controller-manager-774845979b-t9755\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") " pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.729309 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qk2n\" (UniqueName: \"kubernetes.io/projected/c3667d98-cf94-4751-8191-1d924ea13617-kube-api-access-4qk2n\") pod \"route-controller-manager-8d8c487b-4kknf\" (UID: \"c3667d98-cf94-4751-8191-1d924ea13617\") " pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.729407 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-serving-cert\") pod \"controller-manager-774845979b-t9755\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") " pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.729479 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3667d98-cf94-4751-8191-1d924ea13617-config\") pod \"route-controller-manager-8d8c487b-4kknf\" (UID: \"c3667d98-cf94-4751-8191-1d924ea13617\") " pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.729506 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-config\") pod \"controller-manager-774845979b-t9755\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") " pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.729524 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjsgw\" (UniqueName: \"kubernetes.io/projected/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-kube-api-access-xjsgw\") pod \"controller-manager-774845979b-t9755\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") " pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.729543 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.729561 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3667d98-cf94-4751-8191-1d924ea13617-serving-cert\") pod \"route-controller-manager-8d8c487b-4kknf\" (UID: \"c3667d98-cf94-4751-8191-1d924ea13617\") " 
pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.729584 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3667d98-cf94-4751-8191-1d924ea13617-client-ca\") pod \"route-controller-manager-8d8c487b-4kknf\" (UID: \"c3667d98-cf94-4751-8191-1d924ea13617\") " pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.730492 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3667d98-cf94-4751-8191-1d924ea13617-client-ca\") pod \"route-controller-manager-8d8c487b-4kknf\" (UID: \"c3667d98-cf94-4751-8191-1d924ea13617\") " pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.731707 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-proxy-ca-bundles\") pod \"controller-manager-774845979b-t9755\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") " pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.732213 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-client-ca\") pod \"controller-manager-774845979b-t9755\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") " pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.734421 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3667d98-cf94-4751-8191-1d924ea13617-config\") pod 
\"route-controller-manager-8d8c487b-4kknf\" (UID: \"c3667d98-cf94-4751-8191-1d924ea13617\") " pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.736876 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-config\") pod \"controller-manager-774845979b-t9755\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") " pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.738219 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-serving-cert\") pod \"controller-manager-774845979b-t9755\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") " pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.738244 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3667d98-cf94-4751-8191-1d924ea13617-serving-cert\") pod \"route-controller-manager-8d8c487b-4kknf\" (UID: \"c3667d98-cf94-4751-8191-1d924ea13617\") " pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.738311 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.738342 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.747664 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjsgw\" (UniqueName: \"kubernetes.io/projected/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-kube-api-access-xjsgw\") pod \"controller-manager-774845979b-t9755\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") " pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.751084 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qk2n\" (UniqueName: \"kubernetes.io/projected/c3667d98-cf94-4751-8191-1d924ea13617-kube-api-access-4qk2n\") pod \"route-controller-manager-8d8c487b-4kknf\" (UID: \"c3667d98-cf94-4751-8191-1d924ea13617\") " pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.753214 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-52xgq"] Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.763139 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tw95c\" 
(UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.788453 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.815853 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.854852 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.897176 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.897234 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.928392 4781 generic.go:334] "Generic (PLEG): container finished" podID="ac30245d-7e42-440c-99a0-60e2ae15cb8b" containerID="0c5e0439f18997d1945f8c92f69edded31054471dc31175a4e23307895e84fc9" exitCode=0 Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.928690 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kqrgb" 
event={"ID":"ac30245d-7e42-440c-99a0-60e2ae15cb8b","Type":"ContainerDied","Data":"0c5e0439f18997d1945f8c92f69edded31054471dc31175a4e23307895e84fc9"} Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.928728 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kqrgb" event={"ID":"ac30245d-7e42-440c-99a0-60e2ae15cb8b","Type":"ContainerStarted","Data":"ba66da6dc8bfa69982da2943397bfec42cd942427662c0a4732f24accf5f77a6"} Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.936525 4781 generic.go:334] "Generic (PLEG): container finished" podID="19ed5401-2778-4266-8bf1-1c7244dac100" containerID="064248c42fab27054d637381f461db69b3ccaabc4e80c84639f005d76424769a" exitCode=0 Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.936768 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42hbx" event={"ID":"19ed5401-2778-4266-8bf1-1c7244dac100","Type":"ContainerDied","Data":"064248c42fab27054d637381f461db69b3ccaabc4e80c84639f005d76424769a"} Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.936793 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42hbx" event={"ID":"19ed5401-2778-4266-8bf1-1c7244dac100","Type":"ContainerStarted","Data":"78b3df3f6b7f7425a9c2cd10f5b420e9f36ecb616bd533d5cfdfee3767475ccc"} Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.957349 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" event={"ID":"26e75b38-be64-4f34-933f-731abfe217b6","Type":"ContainerStarted","Data":"557bd1bd32a3bf797dc2d98115a973ca7f23c121046b9a168d3bbca91df7a6d2"} Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.960617 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52xgq" 
event={"ID":"0f286d62-2145-4bbb-91eb-28ffda9b2494","Type":"ContainerStarted","Data":"dc9d59b8ab934cad32f1842b836646a3832e9408664f5c6c345f309f196516de"} Feb 27 00:09:12 crc kubenswrapper[4781]: I0227 00:09:12.985904 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-wdgtd" podStartSLOduration=10.985890165 podStartE2EDuration="10.985890165s" podCreationTimestamp="2026-02-27 00:09:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:12.984867251 +0000 UTC m=+222.242406795" watchObservedRunningTime="2026-02-27 00:09:12.985890165 +0000 UTC m=+222.243429719" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.003892 4781 generic.go:334] "Generic (PLEG): container finished" podID="2b050e9e-d6c8-4e27-ad3f-9681553c1539" containerID="d026f2efdb04e4237d23ee456b6f74e1a9fa53cf969148b770c424344e591fbd" exitCode=0 Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.003960 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kztqg" event={"ID":"2b050e9e-d6c8-4e27-ad3f-9681553c1539","Type":"ContainerDied","Data":"d026f2efdb04e4237d23ee456b6f74e1a9fa53cf969148b770c424344e591fbd"} Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.003988 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kztqg" event={"ID":"2b050e9e-d6c8-4e27-ad3f-9681553c1539","Type":"ContainerStarted","Data":"f682c737bcb211243a2988ca17e566ea00c7e2d14bf78fba6f612945a62f66e6"} Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.116991 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-774845979b-t9755"] Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.158597 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-697d97f7c8-tw95c"] Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.229492 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf"] Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.265945 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.316971 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.317691 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a24423db-53f2-4555-81e4-228b3911e144" path="/var/lib/kubelet/pods/a24423db-53f2-4555-81e4-228b3911e144/volumes" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.318295 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7332c18-9748-49d2-b512-a46c2d1fcb79" path="/var/lib/kubelet/pods/c7332c18-9748-49d2-b512-a46c2d1fcb79/volumes" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.367206 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8lcg4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 00:09:13 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld Feb 27 00:09:13 crc kubenswrapper[4781]: [+]process-running ok Feb 27 00:09:13 crc kubenswrapper[4781]: healthz check failed Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.367247 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8lcg4" podUID="6846d54c-4d22-46c7-b017-947a3986d773" containerName="router" probeResult="failure" output="HTTP 
probe failed with statuscode: 500" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.453083 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9ngbg"] Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.454173 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9ngbg" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.455591 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.461715 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9ngbg"] Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.642613 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/baa593f3-06c4-461f-a893-609b07dfd282-utilities\") pod \"redhat-marketplace-9ngbg\" (UID: \"baa593f3-06c4-461f-a893-609b07dfd282\") " pod="openshift-marketplace/redhat-marketplace-9ngbg" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.642744 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mqv2\" (UniqueName: \"kubernetes.io/projected/baa593f3-06c4-461f-a893-609b07dfd282-kube-api-access-9mqv2\") pod \"redhat-marketplace-9ngbg\" (UID: \"baa593f3-06c4-461f-a893-609b07dfd282\") " pod="openshift-marketplace/redhat-marketplace-9ngbg" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.642811 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/baa593f3-06c4-461f-a893-609b07dfd282-catalog-content\") pod \"redhat-marketplace-9ngbg\" (UID: \"baa593f3-06c4-461f-a893-609b07dfd282\") " 
pod="openshift-marketplace/redhat-marketplace-9ngbg" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.728406 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.729118 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.730753 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.731741 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.741987 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.774172 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mqv2\" (UniqueName: \"kubernetes.io/projected/baa593f3-06c4-461f-a893-609b07dfd282-kube-api-access-9mqv2\") pod \"redhat-marketplace-9ngbg\" (UID: \"baa593f3-06c4-461f-a893-609b07dfd282\") " pod="openshift-marketplace/redhat-marketplace-9ngbg" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.774289 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/baa593f3-06c4-461f-a893-609b07dfd282-catalog-content\") pod \"redhat-marketplace-9ngbg\" (UID: \"baa593f3-06c4-461f-a893-609b07dfd282\") " pod="openshift-marketplace/redhat-marketplace-9ngbg" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.774763 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/baa593f3-06c4-461f-a893-609b07dfd282-catalog-content\") pod \"redhat-marketplace-9ngbg\" (UID: \"baa593f3-06c4-461f-a893-609b07dfd282\") " pod="openshift-marketplace/redhat-marketplace-9ngbg" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.774798 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/baa593f3-06c4-461f-a893-609b07dfd282-utilities\") pod \"redhat-marketplace-9ngbg\" (UID: \"baa593f3-06c4-461f-a893-609b07dfd282\") " pod="openshift-marketplace/redhat-marketplace-9ngbg" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.774876 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/baa593f3-06c4-461f-a893-609b07dfd282-utilities\") pod \"redhat-marketplace-9ngbg\" (UID: \"baa593f3-06c4-461f-a893-609b07dfd282\") " pod="openshift-marketplace/redhat-marketplace-9ngbg" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.807722 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mqv2\" (UniqueName: \"kubernetes.io/projected/baa593f3-06c4-461f-a893-609b07dfd282-kube-api-access-9mqv2\") pod \"redhat-marketplace-9ngbg\" (UID: \"baa593f3-06c4-461f-a893-609b07dfd282\") " pod="openshift-marketplace/redhat-marketplace-9ngbg" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.831967 4781 ???:1] "http: TLS handshake error from 192.168.126.11:56592: no serving certificate available for the kubelet" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.855975 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5rnj7"] Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.857031 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5rnj7" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.873904 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5rnj7"] Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.875874 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9a90341e-86fb-4819-848b-cdd71b0ac0a7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"9a90341e-86fb-4819-848b-cdd71b0ac0a7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.875938 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9a90341e-86fb-4819-848b-cdd71b0ac0a7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"9a90341e-86fb-4819-848b-cdd71b0ac0a7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.936673 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.937437 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.940030 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.940172 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.944097 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.977170 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97e44b43-3c8e-4065-a51b-aa3f27c36712-catalog-content\") pod \"redhat-marketplace-5rnj7\" (UID: \"97e44b43-3c8e-4065-a51b-aa3f27c36712\") " pod="openshift-marketplace/redhat-marketplace-5rnj7" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.977219 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97e44b43-3c8e-4065-a51b-aa3f27c36712-utilities\") pod \"redhat-marketplace-5rnj7\" (UID: \"97e44b43-3c8e-4065-a51b-aa3f27c36712\") " pod="openshift-marketplace/redhat-marketplace-5rnj7" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.977246 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9a90341e-86fb-4819-848b-cdd71b0ac0a7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"9a90341e-86fb-4819-848b-cdd71b0ac0a7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.977278 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-w6frr\" (UniqueName: \"kubernetes.io/projected/97e44b43-3c8e-4065-a51b-aa3f27c36712-kube-api-access-w6frr\") pod \"redhat-marketplace-5rnj7\" (UID: \"97e44b43-3c8e-4065-a51b-aa3f27c36712\") " pod="openshift-marketplace/redhat-marketplace-5rnj7" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.977300 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9a90341e-86fb-4819-848b-cdd71b0ac0a7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"9a90341e-86fb-4819-848b-cdd71b0ac0a7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.977596 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9a90341e-86fb-4819-848b-cdd71b0ac0a7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"9a90341e-86fb-4819-848b-cdd71b0ac0a7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 00:09:13 crc kubenswrapper[4781]: I0227 00:09:13.997337 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9a90341e-86fb-4819-848b-cdd71b0ac0a7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"9a90341e-86fb-4819-848b-cdd71b0ac0a7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.011130 4781 generic.go:334] "Generic (PLEG): container finished" podID="0f286d62-2145-4bbb-91eb-28ffda9b2494" containerID="0aa76e86a3a5fbb4cc2b305465d6d4d5e1add388f317a872da345b8ab05c1fce" exitCode=0 Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.011185 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52xgq" 
event={"ID":"0f286d62-2145-4bbb-91eb-28ffda9b2494","Type":"ContainerDied","Data":"0aa76e86a3a5fbb4cc2b305465d6d4d5e1add388f317a872da345b8ab05c1fce"} Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.014114 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" event={"ID":"c3667d98-cf94-4751-8191-1d924ea13617","Type":"ContainerStarted","Data":"2a320ba2160cdf180792174d7de2338c3325b28784d5051920447fb8b1241688"} Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.014209 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" event={"ID":"c3667d98-cf94-4751-8191-1d924ea13617","Type":"ContainerStarted","Data":"3d67897192f1eb6932753a86ce0f7bd6d344c09b54e771c58e76686037dd2268"} Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.014686 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.029999 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.031781 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-774845979b-t9755" event={"ID":"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b","Type":"ContainerStarted","Data":"cca3038329bc9717a99aec45d98f04146a5502dd497a4438c9c5be74b15ed23a"} Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.031823 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-774845979b-t9755" event={"ID":"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b","Type":"ContainerStarted","Data":"8475af139220f4e889d3615e00c27c5c3f916ced71896c2698d7d1d5d2f40792"} Feb 27 00:09:14 crc 
kubenswrapper[4781]: I0227 00:09:14.032085 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.034943 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" event={"ID":"16339491-baee-42b5-82bb-07bca82a5f77","Type":"ContainerStarted","Data":"c0a9479d46d082558311a372af09df12d1f797c7cdde2ce644f1af261b389077"} Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.034976 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" event={"ID":"16339491-baee-42b5-82bb-07bca82a5f77","Type":"ContainerStarted","Data":"baa2ed7e45a407c61fcadf3b6fb1abb2bf58b2f1863ead5f5bd18f0e92393602"} Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.035025 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.039077 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.050730 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" podStartSLOduration=4.050714036 podStartE2EDuration="4.050714036s" podCreationTimestamp="2026-02-27 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:14.048381703 +0000 UTC m=+223.305921257" watchObservedRunningTime="2026-02-27 00:09:14.050714036 +0000 UTC m=+223.308253590" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.076212 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9ngbg" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.077986 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.079769 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dd741f12-8908-4f25-a2ab-2a9deb826494-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"dd741f12-8908-4f25-a2ab-2a9deb826494\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.079887 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97e44b43-3c8e-4065-a51b-aa3f27c36712-catalog-content\") pod \"redhat-marketplace-5rnj7\" (UID: \"97e44b43-3c8e-4065-a51b-aa3f27c36712\") " pod="openshift-marketplace/redhat-marketplace-5rnj7" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.079944 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97e44b43-3c8e-4065-a51b-aa3f27c36712-utilities\") pod \"redhat-marketplace-5rnj7\" (UID: \"97e44b43-3c8e-4065-a51b-aa3f27c36712\") " pod="openshift-marketplace/redhat-marketplace-5rnj7" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.080051 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6frr\" (UniqueName: \"kubernetes.io/projected/97e44b43-3c8e-4065-a51b-aa3f27c36712-kube-api-access-w6frr\") pod \"redhat-marketplace-5rnj7\" (UID: \"97e44b43-3c8e-4065-a51b-aa3f27c36712\") " pod="openshift-marketplace/redhat-marketplace-5rnj7" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.080069 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd741f12-8908-4f25-a2ab-2a9deb826494-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"dd741f12-8908-4f25-a2ab-2a9deb826494\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.080399 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97e44b43-3c8e-4065-a51b-aa3f27c36712-catalog-content\") pod \"redhat-marketplace-5rnj7\" (UID: \"97e44b43-3c8e-4065-a51b-aa3f27c36712\") " pod="openshift-marketplace/redhat-marketplace-5rnj7" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.081508 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97e44b43-3c8e-4065-a51b-aa3f27c36712-utilities\") pod \"redhat-marketplace-5rnj7\" (UID: \"97e44b43-3c8e-4065-a51b-aa3f27c36712\") " pod="openshift-marketplace/redhat-marketplace-5rnj7" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.082331 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.093834 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-cr2bb" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.104364 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6frr\" (UniqueName: \"kubernetes.io/projected/97e44b43-3c8e-4065-a51b-aa3f27c36712-kube-api-access-w6frr\") pod \"redhat-marketplace-5rnj7\" (UID: \"97e44b43-3c8e-4065-a51b-aa3f27c36712\") " pod="openshift-marketplace/redhat-marketplace-5rnj7" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.127502 4781 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" podStartSLOduration=168.127463361 podStartE2EDuration="2m48.127463361s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:14.101868913 +0000 UTC m=+223.359408487" watchObservedRunningTime="2026-02-27 00:09:14.127463361 +0000 UTC m=+223.385002915" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.157210 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-774845979b-t9755" podStartSLOduration=4.157165554 podStartE2EDuration="4.157165554s" podCreationTimestamp="2026-02-27 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:14.150392398 +0000 UTC m=+223.407931952" watchObservedRunningTime="2026-02-27 00:09:14.157165554 +0000 UTC m=+223.414705108" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.177073 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5rnj7" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.183880 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dd741f12-8908-4f25-a2ab-2a9deb826494-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"dd741f12-8908-4f25-a2ab-2a9deb826494\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.184046 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd741f12-8908-4f25-a2ab-2a9deb826494-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"dd741f12-8908-4f25-a2ab-2a9deb826494\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.185090 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dd741f12-8908-4f25-a2ab-2a9deb826494-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"dd741f12-8908-4f25-a2ab-2a9deb826494\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.209959 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.210012 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.211762 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.226945 4781 patch_prober.go:28] interesting pod/console-f9d7485db-vtsxv container/console 
namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.226999 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-vtsxv" podUID="76705148-274c-4428-9508-13fe1193646e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.231470 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd741f12-8908-4f25-a2ab-2a9deb826494-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"dd741f12-8908-4f25-a2ab-2a9deb826494\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.279050 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.372422 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8lcg4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 00:09:14 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld Feb 27 00:09:14 crc kubenswrapper[4781]: [+]process-running ok Feb 27 00:09:14 crc kubenswrapper[4781]: healthz check failed Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.372829 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8lcg4" podUID="6846d54c-4d22-46c7-b017-947a3986d773" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.670061 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hcdz5"] Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.671036 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hcdz5" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.677356 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.743244 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hcdz5"] Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.796194 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9ngbg"] Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.806457 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztvqm\" (UniqueName: \"kubernetes.io/projected/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0-kube-api-access-ztvqm\") pod \"redhat-operators-hcdz5\" (UID: \"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0\") " pod="openshift-marketplace/redhat-operators-hcdz5" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.806548 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0-catalog-content\") pod \"redhat-operators-hcdz5\" (UID: \"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0\") " pod="openshift-marketplace/redhat-operators-hcdz5" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.806593 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0-utilities\") pod \"redhat-operators-hcdz5\" (UID: \"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0\") " pod="openshift-marketplace/redhat-operators-hcdz5" Feb 27 00:09:14 crc kubenswrapper[4781]: W0227 00:09:14.828567 4781 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbaa593f3_06c4_461f_a893_609b07dfd282.slice/crio-9502c5ad99503e1096d0070d626245f0844a912a2ffc6a125931ca6764817da5 WatchSource:0}: Error finding container 9502c5ad99503e1096d0070d626245f0844a912a2ffc6a125931ca6764817da5: Status 404 returned error can't find the container with id 9502c5ad99503e1096d0070d626245f0844a912a2ffc6a125931ca6764817da5 Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.847087 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5rnj7"] Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.907379 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztvqm\" (UniqueName: \"kubernetes.io/projected/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0-kube-api-access-ztvqm\") pod \"redhat-operators-hcdz5\" (UID: \"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0\") " pod="openshift-marketplace/redhat-operators-hcdz5" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.907467 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0-catalog-content\") pod \"redhat-operators-hcdz5\" (UID: \"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0\") " pod="openshift-marketplace/redhat-operators-hcdz5" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.907497 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0-utilities\") pod \"redhat-operators-hcdz5\" (UID: \"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0\") " pod="openshift-marketplace/redhat-operators-hcdz5" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.908031 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0-utilities\") pod 
\"redhat-operators-hcdz5\" (UID: \"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0\") " pod="openshift-marketplace/redhat-operators-hcdz5" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.908489 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0-catalog-content\") pod \"redhat-operators-hcdz5\" (UID: \"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0\") " pod="openshift-marketplace/redhat-operators-hcdz5" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.951061 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztvqm\" (UniqueName: \"kubernetes.io/projected/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0-kube-api-access-ztvqm\") pod \"redhat-operators-hcdz5\" (UID: \"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0\") " pod="openshift-marketplace/redhat-operators-hcdz5" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.955127 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-qjwrj container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.955188 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-qjwrj" podUID="a75bfacf-8cf7-4560-8b4a-6e876daa4c8c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.955256 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-qjwrj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 27 00:09:14 crc kubenswrapper[4781]: I0227 00:09:14.955294 
4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qjwrj" podUID="a75bfacf-8cf7-4560-8b4a-6e876daa4c8c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.003189 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hcdz5" Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.048204 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rnj7" event={"ID":"97e44b43-3c8e-4065-a51b-aa3f27c36712","Type":"ContainerStarted","Data":"b4c78b3d5964c2a730f268fed158cc29cd746663976e63644c0b8dcc232f4b12"} Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.065708 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9ngbg" event={"ID":"baa593f3-06c4-461f-a893-609b07dfd282","Type":"ContainerStarted","Data":"9502c5ad99503e1096d0070d626245f0844a912a2ffc6a125931ca6764817da5"} Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.091522 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dj7h5"] Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.107731 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dj7h5"] Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.107830 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dj7h5" Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.132362 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.172578 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.216841 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z44vv\" (UniqueName: \"kubernetes.io/projected/514049ae-2568-416f-9705-524c2bf74cbd-kube-api-access-z44vv\") pod \"redhat-operators-dj7h5\" (UID: \"514049ae-2568-416f-9705-524c2bf74cbd\") " pod="openshift-marketplace/redhat-operators-dj7h5" Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.217281 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/514049ae-2568-416f-9705-524c2bf74cbd-catalog-content\") pod \"redhat-operators-dj7h5\" (UID: \"514049ae-2568-416f-9705-524c2bf74cbd\") " pod="openshift-marketplace/redhat-operators-dj7h5" Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.217442 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/514049ae-2568-416f-9705-524c2bf74cbd-utilities\") pod \"redhat-operators-dj7h5\" (UID: \"514049ae-2568-416f-9705-524c2bf74cbd\") " pod="openshift-marketplace/redhat-operators-dj7h5" Feb 27 00:09:15 crc kubenswrapper[4781]: W0227 00:09:15.235305 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9a90341e_86fb_4819_848b_cdd71b0ac0a7.slice/crio-7bbdbb00ad1bfd6f9360f74b3ba833fdc090e22888962d9a0ed6331a2064890b WatchSource:0}: Error finding container 
7bbdbb00ad1bfd6f9360f74b3ba833fdc090e22888962d9a0ed6331a2064890b: Status 404 returned error can't find the container with id 7bbdbb00ad1bfd6f9360f74b3ba833fdc090e22888962d9a0ed6331a2064890b Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.327549 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/514049ae-2568-416f-9705-524c2bf74cbd-utilities\") pod \"redhat-operators-dj7h5\" (UID: \"514049ae-2568-416f-9705-524c2bf74cbd\") " pod="openshift-marketplace/redhat-operators-dj7h5" Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.327710 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z44vv\" (UniqueName: \"kubernetes.io/projected/514049ae-2568-416f-9705-524c2bf74cbd-kube-api-access-z44vv\") pod \"redhat-operators-dj7h5\" (UID: \"514049ae-2568-416f-9705-524c2bf74cbd\") " pod="openshift-marketplace/redhat-operators-dj7h5" Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.327781 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/514049ae-2568-416f-9705-524c2bf74cbd-catalog-content\") pod \"redhat-operators-dj7h5\" (UID: \"514049ae-2568-416f-9705-524c2bf74cbd\") " pod="openshift-marketplace/redhat-operators-dj7h5" Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.328555 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/514049ae-2568-416f-9705-524c2bf74cbd-catalog-content\") pod \"redhat-operators-dj7h5\" (UID: \"514049ae-2568-416f-9705-524c2bf74cbd\") " pod="openshift-marketplace/redhat-operators-dj7h5" Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.328587 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/514049ae-2568-416f-9705-524c2bf74cbd-utilities\") pod 
\"redhat-operators-dj7h5\" (UID: \"514049ae-2568-416f-9705-524c2bf74cbd\") " pod="openshift-marketplace/redhat-operators-dj7h5" Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.356557 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z44vv\" (UniqueName: \"kubernetes.io/projected/514049ae-2568-416f-9705-524c2bf74cbd-kube-api-access-z44vv\") pod \"redhat-operators-dj7h5\" (UID: \"514049ae-2568-416f-9705-524c2bf74cbd\") " pod="openshift-marketplace/redhat-operators-dj7h5" Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.370035 4781 patch_prober.go:28] interesting pod/router-default-5444994796-8lcg4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 00:09:15 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld Feb 27 00:09:15 crc kubenswrapper[4781]: [+]process-running ok Feb 27 00:09:15 crc kubenswrapper[4781]: healthz check failed Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.370083 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8lcg4" podUID="6846d54c-4d22-46c7-b017-947a3986d773" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.377495 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-8lcg4" Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.536910 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dj7h5" Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.615310 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hcdz5"] Feb 27 00:09:15 crc kubenswrapper[4781]: I0227 00:09:15.923326 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dj7h5"] Feb 27 00:09:16 crc kubenswrapper[4781]: I0227 00:09:16.082078 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"dd741f12-8908-4f25-a2ab-2a9deb826494","Type":"ContainerStarted","Data":"c3a493a37405ad435f02cc17eedb2fb9132690911ac52fc757e13532a2d8192b"} Feb 27 00:09:16 crc kubenswrapper[4781]: I0227 00:09:16.082563 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"dd741f12-8908-4f25-a2ab-2a9deb826494","Type":"ContainerStarted","Data":"bba3576a0eb52065bd913ed89976d8f6d85c179f2826194a815d95093997aef7"} Feb 27 00:09:16 crc kubenswrapper[4781]: I0227 00:09:16.088760 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9a90341e-86fb-4819-848b-cdd71b0ac0a7","Type":"ContainerStarted","Data":"4d228af20df66a57dfbd572426eaea07b15759a2c69b3a41b9d87c2e34efb05c"} Feb 27 00:09:16 crc kubenswrapper[4781]: I0227 00:09:16.088804 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9a90341e-86fb-4819-848b-cdd71b0ac0a7","Type":"ContainerStarted","Data":"7bbdbb00ad1bfd6f9360f74b3ba833fdc090e22888962d9a0ed6331a2064890b"} Feb 27 00:09:16 crc kubenswrapper[4781]: I0227 00:09:16.093270 4781 generic.go:334] "Generic (PLEG): container finished" podID="97e44b43-3c8e-4065-a51b-aa3f27c36712" containerID="b414a361ce30e28fdc5bc47f53f766e6427e2ccb8cfe76be4eed8ce4ee48ebca" exitCode=0 Feb 27 00:09:16 
crc kubenswrapper[4781]: I0227 00:09:16.093345 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rnj7" event={"ID":"97e44b43-3c8e-4065-a51b-aa3f27c36712","Type":"ContainerDied","Data":"b414a361ce30e28fdc5bc47f53f766e6427e2ccb8cfe76be4eed8ce4ee48ebca"} Feb 27 00:09:16 crc kubenswrapper[4781]: I0227 00:09:16.097538 4781 generic.go:334] "Generic (PLEG): container finished" podID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" containerID="a1770e607d622ea169288727b8ca0b7e8e1a65320533312b8c776b1e821a295a" exitCode=0 Feb 27 00:09:16 crc kubenswrapper[4781]: I0227 00:09:16.097844 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hcdz5" event={"ID":"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0","Type":"ContainerDied","Data":"a1770e607d622ea169288727b8ca0b7e8e1a65320533312b8c776b1e821a295a"} Feb 27 00:09:16 crc kubenswrapper[4781]: I0227 00:09:16.097876 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hcdz5" event={"ID":"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0","Type":"ContainerStarted","Data":"6d980a6fc9de180882f2ee8cc193af0d7ab5d1ba875bfb8da4f55cc14f767f69"} Feb 27 00:09:16 crc kubenswrapper[4781]: I0227 00:09:16.108165 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.108148982 podStartE2EDuration="3.108148982s" podCreationTimestamp="2026-02-27 00:09:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:16.100591048 +0000 UTC m=+225.358130602" watchObservedRunningTime="2026-02-27 00:09:16.108148982 +0000 UTC m=+225.365688536" Feb 27 00:09:16 crc kubenswrapper[4781]: I0227 00:09:16.127830 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
podStartSLOduration=3.127810744 podStartE2EDuration="3.127810744s" podCreationTimestamp="2026-02-27 00:09:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:16.127648551 +0000 UTC m=+225.385188105" watchObservedRunningTime="2026-02-27 00:09:16.127810744 +0000 UTC m=+225.385350298" Feb 27 00:09:16 crc kubenswrapper[4781]: I0227 00:09:16.135846 4781 generic.go:334] "Generic (PLEG): container finished" podID="678f27fc-d210-4a4f-bd73-090378740da9" containerID="898ccef1da25e7c00fcd11040419fe4b505ada16cb26d62d9a4806872cb68348" exitCode=0 Feb 27 00:09:16 crc kubenswrapper[4781]: I0227 00:09:16.135882 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm" event={"ID":"678f27fc-d210-4a4f-bd73-090378740da9","Type":"ContainerDied","Data":"898ccef1da25e7c00fcd11040419fe4b505ada16cb26d62d9a4806872cb68348"} Feb 27 00:09:16 crc kubenswrapper[4781]: I0227 00:09:16.152808 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dj7h5" event={"ID":"514049ae-2568-416f-9705-524c2bf74cbd","Type":"ContainerStarted","Data":"45d5d509e8ad0dc50e09ff3936cc7a26189c6c645b18672248f0a72722749ca4"} Feb 27 00:09:16 crc kubenswrapper[4781]: I0227 00:09:16.162393 4781 generic.go:334] "Generic (PLEG): container finished" podID="baa593f3-06c4-461f-a893-609b07dfd282" containerID="eacb00f064c2ddecdfb7b28e6b61c79e43e456c6031a0ac2b2b97148358dbc91" exitCode=0 Feb 27 00:09:16 crc kubenswrapper[4781]: I0227 00:09:16.162900 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9ngbg" event={"ID":"baa593f3-06c4-461f-a893-609b07dfd282","Type":"ContainerDied","Data":"eacb00f064c2ddecdfb7b28e6b61c79e43e456c6031a0ac2b2b97148358dbc91"} Feb 27 00:09:16 crc kubenswrapper[4781]: I0227 00:09:16.373675 4781 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-8lcg4" Feb 27 00:09:16 crc kubenswrapper[4781]: I0227 00:09:16.377098 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-8lcg4" Feb 27 00:09:17 crc kubenswrapper[4781]: I0227 00:09:17.187531 4781 generic.go:334] "Generic (PLEG): container finished" podID="514049ae-2568-416f-9705-524c2bf74cbd" containerID="39f26f7fa9552ef0082d4338be84e32dc690ddb73a7ed4be83f09421026f56c7" exitCode=0 Feb 27 00:09:17 crc kubenswrapper[4781]: I0227 00:09:17.187618 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dj7h5" event={"ID":"514049ae-2568-416f-9705-524c2bf74cbd","Type":"ContainerDied","Data":"39f26f7fa9552ef0082d4338be84e32dc690ddb73a7ed4be83f09421026f56c7"} Feb 27 00:09:17 crc kubenswrapper[4781]: I0227 00:09:17.965701 4781 generic.go:334] "Generic (PLEG): container finished" podID="dd741f12-8908-4f25-a2ab-2a9deb826494" containerID="c3a493a37405ad435f02cc17eedb2fb9132690911ac52fc757e13532a2d8192b" exitCode=0 Feb 27 00:09:17 crc kubenswrapper[4781]: I0227 00:09:17.965780 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"dd741f12-8908-4f25-a2ab-2a9deb826494","Type":"ContainerDied","Data":"c3a493a37405ad435f02cc17eedb2fb9132690911ac52fc757e13532a2d8192b"} Feb 27 00:09:18 crc kubenswrapper[4781]: I0227 00:09:18.277413 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:09:18 crc kubenswrapper[4781]: I0227 00:09:18.278342 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:09:18 crc kubenswrapper[4781]: I0227 00:09:18.278449 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:09:18 crc kubenswrapper[4781]: I0227 00:09:18.278543 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:09:18 crc kubenswrapper[4781]: I0227 00:09:18.290985 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:09:21 crc kubenswrapper[4781]: I0227 00:09:21.035644 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:09:21 crc kubenswrapper[4781]: I0227 00:09:21.049930 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:09:21 crc kubenswrapper[4781]: I0227 00:09:21.050331 4781 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-fdkct container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 27 00:09:21 crc kubenswrapper[4781]: I0227 00:09:21.050368 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct" podUID="e1c9b213-8c36-4ecf-831f-69a912f6364f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 27 00:09:21 crc kubenswrapper[4781]: I0227 00:09:21.062548 4781 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-fdkct container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": context deadline exceeded" start-of-body= Feb 27 00:09:21 crc kubenswrapper[4781]: I0227 00:09:21.062853 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fdkct" podUID="e1c9b213-8c36-4ecf-831f-69a912f6364f" containerName="openshift-config-operator" probeResult="failure" 
output="Get \"https://10.217.0.13:8443/healthz\": context deadline exceeded" Feb 27 00:09:21 crc kubenswrapper[4781]: I0227 00:09:21.156801 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:09:21 crc kubenswrapper[4781]: I0227 00:09:21.185842 4781 generic.go:334] "Generic (PLEG): container finished" podID="9a90341e-86fb-4819-848b-cdd71b0ac0a7" containerID="4d228af20df66a57dfbd572426eaea07b15759a2c69b3a41b9d87c2e34efb05c" exitCode=0 Feb 27 00:09:21 crc kubenswrapper[4781]: I0227 00:09:21.217306 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:09:21 crc kubenswrapper[4781]: I0227 00:09:21.234794 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 00:09:21 crc kubenswrapper[4781]: I0227 00:09:21.235448 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 00:09:21 crc kubenswrapper[4781]: E0227 00:09:21.236540 4781 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="3.269s" Feb 27 00:09:21 crc kubenswrapper[4781]: I0227 00:09:21.277120 4781 ???:1] "http: TLS handshake error from 192.168.126.11:58336: no serving certificate available for the kubelet" Feb 27 00:09:21 crc kubenswrapper[4781]: I0227 00:09:21.357660 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-k8qh8" Feb 27 00:09:21 crc kubenswrapper[4781]: I0227 00:09:21.357711 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9a90341e-86fb-4819-848b-cdd71b0ac0a7","Type":"ContainerDied","Data":"4d228af20df66a57dfbd572426eaea07b15759a2c69b3a41b9d87c2e34efb05c"} Feb 27 00:09:21 crc kubenswrapper[4781]: I0227 00:09:21.627791 4781 ???:1] "http: TLS handshake error from 192.168.126.11:58342: no serving certificate available for the kubelet" Feb 27 00:09:24 crc kubenswrapper[4781]: I0227 00:09:24.291105 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:24 crc kubenswrapper[4781]: I0227 00:09:24.296107 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:09:24 crc kubenswrapper[4781]: I0227 00:09:24.955051 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-qjwrj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 27 00:09:24 crc kubenswrapper[4781]: I0227 00:09:24.955130 4781 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console/downloads-7954f5f757-qjwrj" podUID="a75bfacf-8cf7-4560-8b4a-6e876daa4c8c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 27 00:09:24 crc kubenswrapper[4781]: I0227 00:09:24.955227 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-qjwrj container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 27 00:09:24 crc kubenswrapper[4781]: I0227 00:09:24.955252 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-qjwrj" podUID="a75bfacf-8cf7-4560-8b4a-6e876daa4c8c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 27 00:09:29 crc kubenswrapper[4781]: I0227 00:09:29.705925 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-774845979b-t9755"] Feb 27 00:09:29 crc kubenswrapper[4781]: I0227 00:09:29.706770 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-774845979b-t9755" podUID="7ee82d7c-7686-4b1a-8bb5-59ed2a93471b" containerName="controller-manager" containerID="cri-o://cca3038329bc9717a99aec45d98f04146a5502dd497a4438c9c5be74b15ed23a" gracePeriod=30 Feb 27 00:09:29 crc kubenswrapper[4781]: I0227 00:09:29.726279 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf"] Feb 27 00:09:29 crc kubenswrapper[4781]: I0227 00:09:29.726653 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" podUID="c3667d98-cf94-4751-8191-1d924ea13617" 
containerName="route-controller-manager" containerID="cri-o://2a320ba2160cdf180792174d7de2338c3325b28784d5051920447fb8b1241688" gracePeriod=30 Feb 27 00:09:31 crc kubenswrapper[4781]: I0227 00:09:31.561675 4781 ???:1] "http: TLS handshake error from 192.168.126.11:45998: no serving certificate available for the kubelet" Feb 27 00:09:32 crc kubenswrapper[4781]: I0227 00:09:32.789830 4781 patch_prober.go:28] interesting pod/controller-manager-774845979b-t9755 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" start-of-body= Feb 27 00:09:32 crc kubenswrapper[4781]: I0227 00:09:32.789946 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-774845979b-t9755" podUID="7ee82d7c-7686-4b1a-8bb5-59ed2a93471b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" Feb 27 00:09:32 crc kubenswrapper[4781]: I0227 00:09:32.816876 4781 patch_prober.go:28] interesting pod/route-controller-manager-8d8c487b-4kknf container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" start-of-body= Feb 27 00:09:32 crc kubenswrapper[4781]: I0227 00:09:32.816925 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" podUID="c3667d98-cf94-4751-8191-1d924ea13617" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" Feb 27 00:09:32 crc kubenswrapper[4781]: I0227 00:09:32.861723 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:09:33 crc kubenswrapper[4781]: I0227 00:09:33.280923 4781 generic.go:334] "Generic (PLEG): container finished" podID="7ee82d7c-7686-4b1a-8bb5-59ed2a93471b" containerID="cca3038329bc9717a99aec45d98f04146a5502dd497a4438c9c5be74b15ed23a" exitCode=0 Feb 27 00:09:33 crc kubenswrapper[4781]: I0227 00:09:33.281009 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-774845979b-t9755" event={"ID":"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b","Type":"ContainerDied","Data":"cca3038329bc9717a99aec45d98f04146a5502dd497a4438c9c5be74b15ed23a"} Feb 27 00:09:34 crc kubenswrapper[4781]: I0227 00:09:34.211916 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs\") pod \"network-metrics-daemon-kpnjj\" (UID: \"e866e388-01ab-407a-a59b-d0ba6c3f6f22\") " pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:09:34 crc kubenswrapper[4781]: I0227 00:09:34.214714 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 27 00:09:34 crc kubenswrapper[4781]: I0227 00:09:34.232365 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e866e388-01ab-407a-a59b-d0ba6c3f6f22-metrics-certs\") pod \"network-metrics-daemon-kpnjj\" (UID: \"e866e388-01ab-407a-a59b-d0ba6c3f6f22\") " pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:09:34 crc kubenswrapper[4781]: I0227 00:09:34.436643 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 27 00:09:34 crc kubenswrapper[4781]: I0227 00:09:34.445188 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpnjj" Feb 27 00:09:34 crc kubenswrapper[4781]: I0227 00:09:34.956130 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-qjwrj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 27 00:09:34 crc kubenswrapper[4781]: I0227 00:09:34.956173 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-qjwrj container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 27 00:09:34 crc kubenswrapper[4781]: I0227 00:09:34.956237 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qjwrj" podUID="a75bfacf-8cf7-4560-8b4a-6e876daa4c8c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 27 00:09:34 crc kubenswrapper[4781]: I0227 00:09:34.956253 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-qjwrj" podUID="a75bfacf-8cf7-4560-8b4a-6e876daa4c8c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 27 00:09:34 crc kubenswrapper[4781]: I0227 00:09:34.956322 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-qjwrj" Feb 27 00:09:34 crc kubenswrapper[4781]: I0227 00:09:34.957257 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"a7c8f7063bafc919361346a9dc1315b92859e40920e9b450653ad64294dfaf96"} pod="openshift-console/downloads-7954f5f757-qjwrj" 
containerMessage="Container download-server failed liveness probe, will be restarted" Feb 27 00:09:34 crc kubenswrapper[4781]: I0227 00:09:34.957244 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-qjwrj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 27 00:09:34 crc kubenswrapper[4781]: I0227 00:09:34.957329 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-qjwrj" podUID="a75bfacf-8cf7-4560-8b4a-6e876daa4c8c" containerName="download-server" containerID="cri-o://a7c8f7063bafc919361346a9dc1315b92859e40920e9b450653ad64294dfaf96" gracePeriod=2 Feb 27 00:09:34 crc kubenswrapper[4781]: I0227 00:09:34.957678 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qjwrj" podUID="a75bfacf-8cf7-4560-8b4a-6e876daa4c8c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 27 00:09:39 crc kubenswrapper[4781]: I0227 00:09:39.318289 4781 generic.go:334] "Generic (PLEG): container finished" podID="c3667d98-cf94-4751-8191-1d924ea13617" containerID="2a320ba2160cdf180792174d7de2338c3325b28784d5051920447fb8b1241688" exitCode=0 Feb 27 00:09:39 crc kubenswrapper[4781]: I0227 00:09:39.318360 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" event={"ID":"c3667d98-cf94-4751-8191-1d924ea13617","Type":"ContainerDied","Data":"2a320ba2160cdf180792174d7de2338c3325b28784d5051920447fb8b1241688"} Feb 27 00:09:40 crc kubenswrapper[4781]: I0227 00:09:40.326283 4781 generic.go:334] "Generic (PLEG): container finished" podID="a75bfacf-8cf7-4560-8b4a-6e876daa4c8c" 
containerID="a7c8f7063bafc919361346a9dc1315b92859e40920e9b450653ad64294dfaf96" exitCode=0 Feb 27 00:09:40 crc kubenswrapper[4781]: I0227 00:09:40.326342 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qjwrj" event={"ID":"a75bfacf-8cf7-4560-8b4a-6e876daa4c8c","Type":"ContainerDied","Data":"a7c8f7063bafc919361346a9dc1315b92859e40920e9b450653ad64294dfaf96"} Feb 27 00:09:42 crc kubenswrapper[4781]: I0227 00:09:42.341892 4781 generic.go:334] "Generic (PLEG): container finished" podID="91e2c481-01ee-461f-bc5b-d09b7ea221c5" containerID="34034ef1e924a05fbc92daf60e2f0c105f332a30b0fe9cea72b0da3d3065e13e" exitCode=0 Feb 27 00:09:42 crc kubenswrapper[4781]: I0227 00:09:42.342091 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29535840-t9tlz" event={"ID":"91e2c481-01ee-461f-bc5b-d09b7ea221c5","Type":"ContainerDied","Data":"34034ef1e924a05fbc92daf60e2f0c105f332a30b0fe9cea72b0da3d3065e13e"} Feb 27 00:09:42 crc kubenswrapper[4781]: I0227 00:09:42.789579 4781 patch_prober.go:28] interesting pod/controller-manager-774845979b-t9755 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" start-of-body= Feb 27 00:09:42 crc kubenswrapper[4781]: I0227 00:09:42.789670 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-774845979b-t9755" podUID="7ee82d7c-7686-4b1a-8bb5-59ed2a93471b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" Feb 27 00:09:42 crc kubenswrapper[4781]: I0227 00:09:42.817184 4781 patch_prober.go:28] interesting pod/route-controller-manager-8d8c487b-4kknf container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe 
status=failure output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" start-of-body= Feb 27 00:09:42 crc kubenswrapper[4781]: I0227 00:09:42.817250 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" podUID="c3667d98-cf94-4751-8191-1d924ea13617" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" Feb 27 00:09:42 crc kubenswrapper[4781]: I0227 00:09:42.895587 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:09:42 crc kubenswrapper[4781]: I0227 00:09:42.895661 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.669214 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29535840-t9tlz" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.676312 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.717450 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.722035 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.776952 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/678f27fc-d210-4a4f-bd73-090378740da9-secret-volume\") pod \"678f27fc-d210-4a4f-bd73-090378740da9\" (UID: \"678f27fc-d210-4a4f-bd73-090378740da9\") " Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.777083 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9a90341e-86fb-4819-848b-cdd71b0ac0a7-kube-api-access\") pod \"9a90341e-86fb-4819-848b-cdd71b0ac0a7\" (UID: \"9a90341e-86fb-4819-848b-cdd71b0ac0a7\") " Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.777116 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/678f27fc-d210-4a4f-bd73-090378740da9-config-volume\") pod \"678f27fc-d210-4a4f-bd73-090378740da9\" (UID: \"678f27fc-d210-4a4f-bd73-090378740da9\") " Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.777150 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9a90341e-86fb-4819-848b-cdd71b0ac0a7-kubelet-dir\") pod \"9a90341e-86fb-4819-848b-cdd71b0ac0a7\" (UID: \"9a90341e-86fb-4819-848b-cdd71b0ac0a7\") " Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.777179 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/91e2c481-01ee-461f-bc5b-d09b7ea221c5-serviceca\") pod 
\"91e2c481-01ee-461f-bc5b-d09b7ea221c5\" (UID: \"91e2c481-01ee-461f-bc5b-d09b7ea221c5\") " Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.777227 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-856kl\" (UniqueName: \"kubernetes.io/projected/91e2c481-01ee-461f-bc5b-d09b7ea221c5-kube-api-access-856kl\") pod \"91e2c481-01ee-461f-bc5b-d09b7ea221c5\" (UID: \"91e2c481-01ee-461f-bc5b-d09b7ea221c5\") " Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.777266 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95hgc\" (UniqueName: \"kubernetes.io/projected/678f27fc-d210-4a4f-bd73-090378740da9-kube-api-access-95hgc\") pod \"678f27fc-d210-4a4f-bd73-090378740da9\" (UID: \"678f27fc-d210-4a4f-bd73-090378740da9\") " Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.777786 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a90341e-86fb-4819-848b-cdd71b0ac0a7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9a90341e-86fb-4819-848b-cdd71b0ac0a7" (UID: "9a90341e-86fb-4819-848b-cdd71b0ac0a7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.780097 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/678f27fc-d210-4a4f-bd73-090378740da9-config-volume" (OuterVolumeSpecName: "config-volume") pod "678f27fc-d210-4a4f-bd73-090378740da9" (UID: "678f27fc-d210-4a4f-bd73-090378740da9"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.780514 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91e2c481-01ee-461f-bc5b-d09b7ea221c5-serviceca" (OuterVolumeSpecName: "serviceca") pod "91e2c481-01ee-461f-bc5b-d09b7ea221c5" (UID: "91e2c481-01ee-461f-bc5b-d09b7ea221c5"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.787244 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/678f27fc-d210-4a4f-bd73-090378740da9-kube-api-access-95hgc" (OuterVolumeSpecName: "kube-api-access-95hgc") pod "678f27fc-d210-4a4f-bd73-090378740da9" (UID: "678f27fc-d210-4a4f-bd73-090378740da9"). InnerVolumeSpecName "kube-api-access-95hgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.787358 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/678f27fc-d210-4a4f-bd73-090378740da9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "678f27fc-d210-4a4f-bd73-090378740da9" (UID: "678f27fc-d210-4a4f-bd73-090378740da9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.789181 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91e2c481-01ee-461f-bc5b-d09b7ea221c5-kube-api-access-856kl" (OuterVolumeSpecName: "kube-api-access-856kl") pod "91e2c481-01ee-461f-bc5b-d09b7ea221c5" (UID: "91e2c481-01ee-461f-bc5b-d09b7ea221c5"). InnerVolumeSpecName "kube-api-access-856kl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.799949 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a90341e-86fb-4819-848b-cdd71b0ac0a7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9a90341e-86fb-4819-848b-cdd71b0ac0a7" (UID: "9a90341e-86fb-4819-848b-cdd71b0ac0a7"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.879808 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd741f12-8908-4f25-a2ab-2a9deb826494-kube-api-access\") pod \"dd741f12-8908-4f25-a2ab-2a9deb826494\" (UID: \"dd741f12-8908-4f25-a2ab-2a9deb826494\") " Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.879898 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dd741f12-8908-4f25-a2ab-2a9deb826494-kubelet-dir\") pod \"dd741f12-8908-4f25-a2ab-2a9deb826494\" (UID: \"dd741f12-8908-4f25-a2ab-2a9deb826494\") " Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.880143 4781 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/678f27fc-d210-4a4f-bd73-090378740da9-config-volume\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.880154 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9a90341e-86fb-4819-848b-cdd71b0ac0a7-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.880165 4781 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9a90341e-86fb-4819-848b-cdd71b0ac0a7-kubelet-dir\") on node \"crc\" DevicePath 
\"\"" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.880173 4781 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/91e2c481-01ee-461f-bc5b-d09b7ea221c5-serviceca\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.880181 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-856kl\" (UniqueName: \"kubernetes.io/projected/91e2c481-01ee-461f-bc5b-d09b7ea221c5-kube-api-access-856kl\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.880191 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95hgc\" (UniqueName: \"kubernetes.io/projected/678f27fc-d210-4a4f-bd73-090378740da9-kube-api-access-95hgc\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.880199 4781 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/678f27fc-d210-4a4f-bd73-090378740da9-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.880238 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd741f12-8908-4f25-a2ab-2a9deb826494-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "dd741f12-8908-4f25-a2ab-2a9deb826494" (UID: "dd741f12-8908-4f25-a2ab-2a9deb826494"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.884248 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd741f12-8908-4f25-a2ab-2a9deb826494-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "dd741f12-8908-4f25-a2ab-2a9deb826494" (UID: "dd741f12-8908-4f25-a2ab-2a9deb826494"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.981330 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd741f12-8908-4f25-a2ab-2a9deb826494-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:43 crc kubenswrapper[4781]: I0227 00:09:43.981360 4781 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dd741f12-8908-4f25-a2ab-2a9deb826494-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:44 crc kubenswrapper[4781]: I0227 00:09:44.357904 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm" event={"ID":"678f27fc-d210-4a4f-bd73-090378740da9","Type":"ContainerDied","Data":"8e97fd8fcdef99a06975af07b11d983d49d1856c8a620f0853e184ef575d88e1"} Feb 27 00:09:44 crc kubenswrapper[4781]: I0227 00:09:44.358296 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e97fd8fcdef99a06975af07b11d983d49d1856c8a620f0853e184ef575d88e1" Feb 27 00:09:44 crc kubenswrapper[4781]: I0227 00:09:44.357958 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm" Feb 27 00:09:44 crc kubenswrapper[4781]: I0227 00:09:44.362148 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 00:09:44 crc kubenswrapper[4781]: I0227 00:09:44.362152 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"dd741f12-8908-4f25-a2ab-2a9deb826494","Type":"ContainerDied","Data":"bba3576a0eb52065bd913ed89976d8f6d85c179f2826194a815d95093997aef7"} Feb 27 00:09:44 crc kubenswrapper[4781]: I0227 00:09:44.362238 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bba3576a0eb52065bd913ed89976d8f6d85c179f2826194a815d95093997aef7" Feb 27 00:09:44 crc kubenswrapper[4781]: I0227 00:09:44.364197 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9a90341e-86fb-4819-848b-cdd71b0ac0a7","Type":"ContainerDied","Data":"7bbdbb00ad1bfd6f9360f74b3ba833fdc090e22888962d9a0ed6331a2064890b"} Feb 27 00:09:44 crc kubenswrapper[4781]: I0227 00:09:44.364224 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bbdbb00ad1bfd6f9360f74b3ba833fdc090e22888962d9a0ed6331a2064890b" Feb 27 00:09:44 crc kubenswrapper[4781]: I0227 00:09:44.364264 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 00:09:44 crc kubenswrapper[4781]: I0227 00:09:44.365612 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29535840-t9tlz" event={"ID":"91e2c481-01ee-461f-bc5b-d09b7ea221c5","Type":"ContainerDied","Data":"02350f41c01977124604e142f885201d5743582263439e32be7f03871d0f9773"} Feb 27 00:09:44 crc kubenswrapper[4781]: I0227 00:09:44.365665 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02350f41c01977124604e142f885201d5743582263439e32be7f03871d0f9773" Feb 27 00:09:44 crc kubenswrapper[4781]: I0227 00:09:44.365725 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29535840-t9tlz" Feb 27 00:09:44 crc kubenswrapper[4781]: E0227 00:09:44.919772 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Feb 27 00:09:44 crc kubenswrapper[4781]: E0227 00:09:44.920336 4781 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 00:09:44 crc kubenswrapper[4781]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Feb 27 00:09:44 crc kubenswrapper[4781]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mv9hp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29535848-ccctv_openshift-infra(df035290-8e3c-422b-90ac-573b592defcf): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Feb 27 00:09:44 crc kubenswrapper[4781]: > logger="UnhandledError" Feb 27 00:09:44 crc kubenswrapper[4781]: E0227 00:09:44.922438 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29535848-ccctv" podUID="df035290-8e3c-422b-90ac-573b592defcf" Feb 27 00:09:44 crc kubenswrapper[4781]: I0227 00:09:44.955102 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-qjwrj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 27 00:09:44 crc kubenswrapper[4781]: I0227 00:09:44.955179 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qjwrj" podUID="a75bfacf-8cf7-4560-8b4a-6e876daa4c8c" containerName="download-server" 
probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.189569 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8mth6" Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.327737 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.334328 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.374908 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.374939 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf" event={"ID":"c3667d98-cf94-4751-8191-1d924ea13617","Type":"ContainerDied","Data":"3d67897192f1eb6932753a86ce0f7bd6d344c09b54e771c58e76686037dd2268"} Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.375023 4781 scope.go:117] "RemoveContainer" containerID="2a320ba2160cdf180792174d7de2338c3325b28784d5051920447fb8b1241688" Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.378200 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-774845979b-t9755" Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.378374 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-774845979b-t9755" event={"ID":"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b","Type":"ContainerDied","Data":"8475af139220f4e889d3615e00c27c5c3f916ced71896c2698d7d1d5d2f40792"} Feb 27 00:09:45 crc kubenswrapper[4781]: E0227 00:09:45.379002 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29535848-ccctv" podUID="df035290-8e3c-422b-90ac-573b592defcf" Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.407495 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3667d98-cf94-4751-8191-1d924ea13617-client-ca\") pod \"c3667d98-cf94-4751-8191-1d924ea13617\" (UID: \"c3667d98-cf94-4751-8191-1d924ea13617\") " Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.407553 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-serving-cert\") pod \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") " Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.407588 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3667d98-cf94-4751-8191-1d924ea13617-serving-cert\") pod \"c3667d98-cf94-4751-8191-1d924ea13617\" (UID: \"c3667d98-cf94-4751-8191-1d924ea13617\") " Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.407718 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/c3667d98-cf94-4751-8191-1d924ea13617-config\") pod \"c3667d98-cf94-4751-8191-1d924ea13617\" (UID: \"c3667d98-cf94-4751-8191-1d924ea13617\") " Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.407845 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qk2n\" (UniqueName: \"kubernetes.io/projected/c3667d98-cf94-4751-8191-1d924ea13617-kube-api-access-4qk2n\") pod \"c3667d98-cf94-4751-8191-1d924ea13617\" (UID: \"c3667d98-cf94-4751-8191-1d924ea13617\") " Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.407881 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-proxy-ca-bundles\") pod \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") " Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.407916 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjsgw\" (UniqueName: \"kubernetes.io/projected/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-kube-api-access-xjsgw\") pod \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") " Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.407940 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-config\") pod \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") " Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.407964 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-client-ca\") pod \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\" (UID: \"7ee82d7c-7686-4b1a-8bb5-59ed2a93471b\") " Feb 27 00:09:45 crc kubenswrapper[4781]: 
I0227 00:09:45.408299 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3667d98-cf94-4751-8191-1d924ea13617-client-ca" (OuterVolumeSpecName: "client-ca") pod "c3667d98-cf94-4751-8191-1d924ea13617" (UID: "c3667d98-cf94-4751-8191-1d924ea13617"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.409050 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3667d98-cf94-4751-8191-1d924ea13617-config" (OuterVolumeSpecName: "config") pod "c3667d98-cf94-4751-8191-1d924ea13617" (UID: "c3667d98-cf94-4751-8191-1d924ea13617"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.409067 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7ee82d7c-7686-4b1a-8bb5-59ed2a93471b" (UID: "7ee82d7c-7686-4b1a-8bb5-59ed2a93471b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.409138 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-client-ca" (OuterVolumeSpecName: "client-ca") pod "7ee82d7c-7686-4b1a-8bb5-59ed2a93471b" (UID: "7ee82d7c-7686-4b1a-8bb5-59ed2a93471b"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.409153 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-config" (OuterVolumeSpecName: "config") pod "7ee82d7c-7686-4b1a-8bb5-59ed2a93471b" (UID: "7ee82d7c-7686-4b1a-8bb5-59ed2a93471b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.414824 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3667d98-cf94-4751-8191-1d924ea13617-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c3667d98-cf94-4751-8191-1d924ea13617" (UID: "c3667d98-cf94-4751-8191-1d924ea13617"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.414851 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-kube-api-access-xjsgw" (OuterVolumeSpecName: "kube-api-access-xjsgw") pod "7ee82d7c-7686-4b1a-8bb5-59ed2a93471b" (UID: "7ee82d7c-7686-4b1a-8bb5-59ed2a93471b"). InnerVolumeSpecName "kube-api-access-xjsgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.417264 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3667d98-cf94-4751-8191-1d924ea13617-kube-api-access-4qk2n" (OuterVolumeSpecName: "kube-api-access-4qk2n") pod "c3667d98-cf94-4751-8191-1d924ea13617" (UID: "c3667d98-cf94-4751-8191-1d924ea13617"). InnerVolumeSpecName "kube-api-access-4qk2n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.418798 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7ee82d7c-7686-4b1a-8bb5-59ed2a93471b" (UID: "7ee82d7c-7686-4b1a-8bb5-59ed2a93471b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.509330 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3667d98-cf94-4751-8191-1d924ea13617-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.509364 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.509374 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3667d98-cf94-4751-8191-1d924ea13617-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.509386 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3667d98-cf94-4751-8191-1d924ea13617-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.509396 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qk2n\" (UniqueName: \"kubernetes.io/projected/c3667d98-cf94-4751-8191-1d924ea13617-kube-api-access-4qk2n\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.509407 4781 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.509418 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjsgw\" (UniqueName: \"kubernetes.io/projected/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-kube-api-access-xjsgw\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.509427 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.509434 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.702209 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf"] Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.706526 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d8c487b-4kknf"] Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.712284 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-774845979b-t9755"] Feb 27 00:09:45 crc kubenswrapper[4781]: I0227 00:09:45.715494 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-774845979b-t9755"] Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.089410 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d"] Feb 27 00:09:46 crc kubenswrapper[4781]: E0227 00:09:46.089751 4781 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="9a90341e-86fb-4819-848b-cdd71b0ac0a7" containerName="pruner" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.089764 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a90341e-86fb-4819-848b-cdd71b0ac0a7" containerName="pruner" Feb 27 00:09:46 crc kubenswrapper[4781]: E0227 00:09:46.089773 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ee82d7c-7686-4b1a-8bb5-59ed2a93471b" containerName="controller-manager" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.089780 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ee82d7c-7686-4b1a-8bb5-59ed2a93471b" containerName="controller-manager" Feb 27 00:09:46 crc kubenswrapper[4781]: E0227 00:09:46.089792 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd741f12-8908-4f25-a2ab-2a9deb826494" containerName="pruner" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.089798 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd741f12-8908-4f25-a2ab-2a9deb826494" containerName="pruner" Feb 27 00:09:46 crc kubenswrapper[4781]: E0227 00:09:46.089807 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3667d98-cf94-4751-8191-1d924ea13617" containerName="route-controller-manager" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.089812 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3667d98-cf94-4751-8191-1d924ea13617" containerName="route-controller-manager" Feb 27 00:09:46 crc kubenswrapper[4781]: E0227 00:09:46.089823 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91e2c481-01ee-461f-bc5b-d09b7ea221c5" containerName="image-pruner" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.089828 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="91e2c481-01ee-461f-bc5b-d09b7ea221c5" containerName="image-pruner" Feb 27 00:09:46 crc kubenswrapper[4781]: E0227 00:09:46.089842 4781 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="678f27fc-d210-4a4f-bd73-090378740da9" containerName="collect-profiles" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.089847 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="678f27fc-d210-4a4f-bd73-090378740da9" containerName="collect-profiles" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.090037 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="678f27fc-d210-4a4f-bd73-090378740da9" containerName="collect-profiles" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.090049 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ee82d7c-7686-4b1a-8bb5-59ed2a93471b" containerName="controller-manager" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.090062 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd741f12-8908-4f25-a2ab-2a9deb826494" containerName="pruner" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.090070 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a90341e-86fb-4819-848b-cdd71b0ac0a7" containerName="pruner" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.090078 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="91e2c481-01ee-461f-bc5b-d09b7ea221c5" containerName="image-pruner" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.090092 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3667d98-cf94-4751-8191-1d924ea13617" containerName="route-controller-manager" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.090790 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.093459 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.094074 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.094154 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.094151 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.094318 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.094106 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.114707 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5cf657794c-phnhf"] Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.116019 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.121789 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5cf657794c-phnhf"] Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.123162 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.123271 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.123316 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.123345 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.123422 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.125278 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d"] Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.129385 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.131367 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.220276 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/16dfdff5-f774-4b57-adcf-587eb1a87012-client-ca\") pod \"controller-manager-5cf657794c-phnhf\" (UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.220353 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vtpb\" (UniqueName: \"kubernetes.io/projected/16dfdff5-f774-4b57-adcf-587eb1a87012-kube-api-access-2vtpb\") pod \"controller-manager-5cf657794c-phnhf\" (UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.220437 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-config\") pod \"route-controller-manager-8577b6d867-bbk7d\" (UID: \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\") " pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.220457 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16dfdff5-f774-4b57-adcf-587eb1a87012-config\") pod \"controller-manager-5cf657794c-phnhf\" (UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.220481 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpsw7\" (UniqueName: \"kubernetes.io/projected/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-kube-api-access-qpsw7\") pod \"route-controller-manager-8577b6d867-bbk7d\" (UID: \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\") " 
pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.220502 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16dfdff5-f774-4b57-adcf-587eb1a87012-serving-cert\") pod \"controller-manager-5cf657794c-phnhf\" (UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.220520 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16dfdff5-f774-4b57-adcf-587eb1a87012-proxy-ca-bundles\") pod \"controller-manager-5cf657794c-phnhf\" (UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.220578 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-client-ca\") pod \"route-controller-manager-8577b6d867-bbk7d\" (UID: \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\") " pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.220595 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-serving-cert\") pod \"route-controller-manager-8577b6d867-bbk7d\" (UID: \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\") " pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.321513 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-client-ca\") pod \"route-controller-manager-8577b6d867-bbk7d\" (UID: \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\") " pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.321553 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-serving-cert\") pod \"route-controller-manager-8577b6d867-bbk7d\" (UID: \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\") " pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.321585 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16dfdff5-f774-4b57-adcf-587eb1a87012-client-ca\") pod \"controller-manager-5cf657794c-phnhf\" (UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.321642 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vtpb\" (UniqueName: \"kubernetes.io/projected/16dfdff5-f774-4b57-adcf-587eb1a87012-kube-api-access-2vtpb\") pod \"controller-manager-5cf657794c-phnhf\" (UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.321677 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-config\") pod \"route-controller-manager-8577b6d867-bbk7d\" (UID: \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\") " pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d" 
Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.321694 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16dfdff5-f774-4b57-adcf-587eb1a87012-config\") pod \"controller-manager-5cf657794c-phnhf\" (UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.321712 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpsw7\" (UniqueName: \"kubernetes.io/projected/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-kube-api-access-qpsw7\") pod \"route-controller-manager-8577b6d867-bbk7d\" (UID: \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\") " pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.321733 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16dfdff5-f774-4b57-adcf-587eb1a87012-serving-cert\") pod \"controller-manager-5cf657794c-phnhf\" (UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.321750 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16dfdff5-f774-4b57-adcf-587eb1a87012-proxy-ca-bundles\") pod \"controller-manager-5cf657794c-phnhf\" (UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.322836 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16dfdff5-f774-4b57-adcf-587eb1a87012-proxy-ca-bundles\") pod \"controller-manager-5cf657794c-phnhf\" 
(UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.323363 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-client-ca\") pod \"route-controller-manager-8577b6d867-bbk7d\" (UID: \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\") " pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.323730 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-config\") pod \"route-controller-manager-8577b6d867-bbk7d\" (UID: \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\") " pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.324326 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16dfdff5-f774-4b57-adcf-587eb1a87012-config\") pod \"controller-manager-5cf657794c-phnhf\" (UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.325915 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16dfdff5-f774-4b57-adcf-587eb1a87012-client-ca\") pod \"controller-manager-5cf657794c-phnhf\" (UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.328659 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-serving-cert\") pod \"route-controller-manager-8577b6d867-bbk7d\" (UID: \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\") " pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.329084 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16dfdff5-f774-4b57-adcf-587eb1a87012-serving-cert\") pod \"controller-manager-5cf657794c-phnhf\" (UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.339073 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vtpb\" (UniqueName: \"kubernetes.io/projected/16dfdff5-f774-4b57-adcf-587eb1a87012-kube-api-access-2vtpb\") pod \"controller-manager-5cf657794c-phnhf\" (UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.343112 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpsw7\" (UniqueName: \"kubernetes.io/projected/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-kube-api-access-qpsw7\") pod \"route-controller-manager-8577b6d867-bbk7d\" (UID: \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\") " pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.416023 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d" Feb 27 00:09:46 crc kubenswrapper[4781]: I0227 00:09:46.435621 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf" Feb 27 00:09:47 crc kubenswrapper[4781]: I0227 00:09:47.316948 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ee82d7c-7686-4b1a-8bb5-59ed2a93471b" path="/var/lib/kubelet/pods/7ee82d7c-7686-4b1a-8bb5-59ed2a93471b/volumes" Feb 27 00:09:47 crc kubenswrapper[4781]: I0227 00:09:47.317765 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3667d98-cf94-4751-8191-1d924ea13617" path="/var/lib/kubelet/pods/c3667d98-cf94-4751-8191-1d924ea13617/volumes" Feb 27 00:09:48 crc kubenswrapper[4781]: I0227 00:09:48.265109 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 27 00:09:48 crc kubenswrapper[4781]: I0227 00:09:48.266265 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 00:09:48 crc kubenswrapper[4781]: I0227 00:09:48.268898 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 27 00:09:48 crc kubenswrapper[4781]: I0227 00:09:48.268948 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 27 00:09:48 crc kubenswrapper[4781]: I0227 00:09:48.274055 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 27 00:09:48 crc kubenswrapper[4781]: I0227 00:09:48.351214 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/180f65d9-1cb5-411b-a031-6f97c06811d1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"180f65d9-1cb5-411b-a031-6f97c06811d1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 00:09:48 crc kubenswrapper[4781]: I0227 00:09:48.351345 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/180f65d9-1cb5-411b-a031-6f97c06811d1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"180f65d9-1cb5-411b-a031-6f97c06811d1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 00:09:48 crc kubenswrapper[4781]: I0227 00:09:48.453071 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/180f65d9-1cb5-411b-a031-6f97c06811d1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"180f65d9-1cb5-411b-a031-6f97c06811d1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 00:09:48 crc kubenswrapper[4781]: I0227 00:09:48.453180 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/180f65d9-1cb5-411b-a031-6f97c06811d1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"180f65d9-1cb5-411b-a031-6f97c06811d1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 00:09:48 crc kubenswrapper[4781]: I0227 00:09:48.453262 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/180f65d9-1cb5-411b-a031-6f97c06811d1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"180f65d9-1cb5-411b-a031-6f97c06811d1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 00:09:48 crc kubenswrapper[4781]: I0227 00:09:48.470644 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/180f65d9-1cb5-411b-a031-6f97c06811d1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"180f65d9-1cb5-411b-a031-6f97c06811d1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 00:09:48 crc kubenswrapper[4781]: I0227 00:09:48.592899 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 00:09:49 crc kubenswrapper[4781]: I0227 00:09:49.654731 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5cf657794c-phnhf"] Feb 27 00:09:49 crc kubenswrapper[4781]: I0227 00:09:49.754585 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d"] Feb 27 00:09:52 crc kubenswrapper[4781]: I0227 00:09:52.656372 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 27 00:09:52 crc kubenswrapper[4781]: I0227 00:09:52.657287 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 27 00:09:52 crc kubenswrapper[4781]: I0227 00:09:52.672313 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 27 00:09:52 crc kubenswrapper[4781]: I0227 00:09:52.710911 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7c8795e9-9244-4cc4-a297-3aec68bf3588-kubelet-dir\") pod \"installer-9-crc\" (UID: \"7c8795e9-9244-4cc4-a297-3aec68bf3588\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 27 00:09:52 crc kubenswrapper[4781]: I0227 00:09:52.710962 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7c8795e9-9244-4cc4-a297-3aec68bf3588-kube-api-access\") pod \"installer-9-crc\" (UID: \"7c8795e9-9244-4cc4-a297-3aec68bf3588\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 27 00:09:52 crc kubenswrapper[4781]: I0227 00:09:52.711004 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/7c8795e9-9244-4cc4-a297-3aec68bf3588-var-lock\") pod \"installer-9-crc\" (UID: \"7c8795e9-9244-4cc4-a297-3aec68bf3588\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 27 00:09:52 crc kubenswrapper[4781]: I0227 00:09:52.813139 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7c8795e9-9244-4cc4-a297-3aec68bf3588-kube-api-access\") pod \"installer-9-crc\" (UID: \"7c8795e9-9244-4cc4-a297-3aec68bf3588\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 27 00:09:52 crc kubenswrapper[4781]: I0227 00:09:52.813213 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7c8795e9-9244-4cc4-a297-3aec68bf3588-var-lock\") pod \"installer-9-crc\" (UID: \"7c8795e9-9244-4cc4-a297-3aec68bf3588\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 27 00:09:52 crc kubenswrapper[4781]: I0227 00:09:52.813295 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7c8795e9-9244-4cc4-a297-3aec68bf3588-kubelet-dir\") pod \"installer-9-crc\" (UID: \"7c8795e9-9244-4cc4-a297-3aec68bf3588\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 27 00:09:52 crc kubenswrapper[4781]: I0227 00:09:52.813360 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7c8795e9-9244-4cc4-a297-3aec68bf3588-kubelet-dir\") pod \"installer-9-crc\" (UID: \"7c8795e9-9244-4cc4-a297-3aec68bf3588\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 27 00:09:52 crc kubenswrapper[4781]: I0227 00:09:52.813411 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7c8795e9-9244-4cc4-a297-3aec68bf3588-var-lock\") pod \"installer-9-crc\" (UID: \"7c8795e9-9244-4cc4-a297-3aec68bf3588\") " 
pod="openshift-kube-apiserver/installer-9-crc" Feb 27 00:09:52 crc kubenswrapper[4781]: I0227 00:09:52.852390 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7c8795e9-9244-4cc4-a297-3aec68bf3588-kube-api-access\") pod \"installer-9-crc\" (UID: \"7c8795e9-9244-4cc4-a297-3aec68bf3588\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 27 00:09:52 crc kubenswrapper[4781]: I0227 00:09:52.981949 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 27 00:09:54 crc kubenswrapper[4781]: I0227 00:09:54.955255 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-qjwrj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 27 00:09:54 crc kubenswrapper[4781]: I0227 00:09:54.956760 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qjwrj" podUID="a75bfacf-8cf7-4560-8b4a-6e876daa4c8c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 27 00:09:55 crc kubenswrapper[4781]: E0227 00:09:55.363710 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 27 00:09:55 crc kubenswrapper[4781]: E0227 00:09:55.363867 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zpnxw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-kztqg_openshift-marketplace(2b050e9e-d6c8-4e27-ad3f-9681553c1539): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 00:09:55 crc kubenswrapper[4781]: E0227 00:09:55.365080 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-kztqg" podUID="2b050e9e-d6c8-4e27-ad3f-9681553c1539" Feb 27 00:09:55 crc 
kubenswrapper[4781]: E0227 00:09:55.379268 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 27 00:09:55 crc kubenswrapper[4781]: E0227 00:09:55.379405 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f8558,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-52xgq_openshift-marketplace(0f286d62-2145-4bbb-91eb-28ffda9b2494): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 00:09:55 crc kubenswrapper[4781]: E0227 00:09:55.380597 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-52xgq" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" Feb 27 00:09:55 crc kubenswrapper[4781]: E0227 00:09:55.382817 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 27 00:09:55 crc kubenswrapper[4781]: E0227 00:09:55.383017 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9mqv2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-9ngbg_openshift-marketplace(baa593f3-06c4-461f-a893-609b07dfd282): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 00:09:55 crc kubenswrapper[4781]: E0227 00:09:55.384324 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-9ngbg" podUID="baa593f3-06c4-461f-a893-609b07dfd282" Feb 27 00:09:55 crc 
kubenswrapper[4781]: E0227 00:09:55.413133 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 27 00:09:55 crc kubenswrapper[4781]: E0227 00:09:55.413301 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ztvqm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-hcdz5_openshift-marketplace(a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 00:09:55 crc kubenswrapper[4781]: E0227 00:09:55.414671 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-hcdz5" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" Feb 27 00:09:56 crc kubenswrapper[4781]: I0227 00:09:56.875774 4781 scope.go:117] "RemoveContainer" containerID="cca3038329bc9717a99aec45d98f04146a5502dd497a4438c9c5be74b15ed23a" Feb 27 00:09:56 crc kubenswrapper[4781]: W0227 00:09:56.882736 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-3491d7e1df67d43284793d5f6326675b7d0fa2567f1c03fbea4368ba2185e97e WatchSource:0}: Error finding container 3491d7e1df67d43284793d5f6326675b7d0fa2567f1c03fbea4368ba2185e97e: Status 404 returned error can't find the container with id 3491d7e1df67d43284793d5f6326675b7d0fa2567f1c03fbea4368ba2185e97e Feb 27 00:09:56 crc kubenswrapper[4781]: E0227 00:09:56.899047 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-hcdz5" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" Feb 27 00:09:56 crc kubenswrapper[4781]: E0227 00:09:56.899047 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-kztqg" podUID="2b050e9e-d6c8-4e27-ad3f-9681553c1539" Feb 27 00:09:56 crc kubenswrapper[4781]: E0227 00:09:56.899072 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-9ngbg" podUID="baa593f3-06c4-461f-a893-609b07dfd282" Feb 27 00:09:56 crc kubenswrapper[4781]: E0227 00:09:56.899118 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-52xgq" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" Feb 27 00:09:56 crc kubenswrapper[4781]: W0227 00:09:56.925684 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-7a34eb1a5ac66ea850a359793ec1389d7e0143f96eeca72531294b980e93a4ab WatchSource:0}: Error finding container 7a34eb1a5ac66ea850a359793ec1389d7e0143f96eeca72531294b980e93a4ab: Status 404 returned error can't find the container with id 7a34eb1a5ac66ea850a359793ec1389d7e0143f96eeca72531294b980e93a4ab Feb 27 00:09:56 crc kubenswrapper[4781]: E0227 00:09:56.961611 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 27 00:09:56 crc kubenswrapper[4781]: E0227 00:09:56.961798 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xgndh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-42hbx_openshift-marketplace(19ed5401-2778-4266-8bf1-1c7244dac100): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 00:09:56 crc kubenswrapper[4781]: E0227 00:09:56.962954 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-42hbx" podUID="19ed5401-2778-4266-8bf1-1c7244dac100" Feb 27 00:09:56 crc kubenswrapper[4781]: E0227 00:09:56.980712 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 27 00:09:56 crc kubenswrapper[4781]: E0227 00:09:56.980883 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8tk7f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFro
mSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-kqrgb_openshift-marketplace(ac30245d-7e42-440c-99a0-60e2ae15cb8b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 00:09:56 crc kubenswrapper[4781]: E0227 00:09:56.985239 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-kqrgb" podUID="ac30245d-7e42-440c-99a0-60e2ae15cb8b" Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.074279 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kpnjj"] Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.373239 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 27 00:09:57 crc kubenswrapper[4781]: W0227 00:09:57.387399 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7c8795e9_9244_4cc4_a297_3aec68bf3588.slice/crio-f39f3390eb5e42a403a575333d110cbe5ece9b7617819b5c4f74d934848ba9f2 WatchSource:0}: Error finding container f39f3390eb5e42a403a575333d110cbe5ece9b7617819b5c4f74d934848ba9f2: Status 404 returned error can't find the container with id f39f3390eb5e42a403a575333d110cbe5ece9b7617819b5c4f74d934848ba9f2 Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.447487 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.452722 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/downloads-7954f5f757-qjwrj" event={"ID":"a75bfacf-8cf7-4560-8b4a-6e876daa4c8c","Type":"ContainerStarted","Data":"4aa0dd804d65361572e088d045b1114ad255cfdb61d9c66d8a70600e9afb7537"} Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.458317 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-qjwrj" Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.458409 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-qjwrj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.458446 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qjwrj" podUID="a75bfacf-8cf7-4560-8b4a-6e876daa4c8c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.460555 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d"] Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.470149 4781 generic.go:334] "Generic (PLEG): container finished" podID="97e44b43-3c8e-4065-a51b-aa3f27c36712" containerID="9c12fe8df9037297d8af4eedba0d4e04fa1c5be02d943f1b25318346033b7fc9" exitCode=0 Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.470391 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rnj7" event={"ID":"97e44b43-3c8e-4065-a51b-aa3f27c36712","Type":"ContainerDied","Data":"9c12fe8df9037297d8af4eedba0d4e04fa1c5be02d943f1b25318346033b7fc9"} Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.473506 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"8e08f49816417f62f0f1608baa02644560149e4178f81c7d3d13162bc75dabde"} Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.473700 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"7a34eb1a5ac66ea850a359793ec1389d7e0143f96eeca72531294b980e93a4ab"} Feb 27 00:09:57 crc kubenswrapper[4781]: W0227 00:09:57.476680 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52bc89a7_b3b5_4ab5_ad64_4df7cd38b1b9.slice/crio-afebd9373c32a2481ceafebd85a9c76dfa968abcad4e05173bfa83af78d85aa8 WatchSource:0}: Error finding container afebd9373c32a2481ceafebd85a9c76dfa968abcad4e05173bfa83af78d85aa8: Status 404 returned error can't find the container with id afebd9373c32a2481ceafebd85a9c76dfa968abcad4e05173bfa83af78d85aa8 Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.490741 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dj7h5" event={"ID":"514049ae-2568-416f-9705-524c2bf74cbd","Type":"ContainerStarted","Data":"1701a618ae78c1968b5098401e32f2c349b0a0fd1ab9fdf4f23fd86a66112646"} Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.498223 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c8a6f0698f05db56db7f57e3f2bb2c8c9abc78ba0074a72d0395de435ebed130"} Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.498250 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"3491d7e1df67d43284793d5f6326675b7d0fa2567f1c03fbea4368ba2185e97e"} Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.499762 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"7c8795e9-9244-4cc4-a297-3aec68bf3588","Type":"ContainerStarted","Data":"f39f3390eb5e42a403a575333d110cbe5ece9b7617819b5c4f74d934848ba9f2"} Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.513211 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" event={"ID":"e866e388-01ab-407a-a59b-d0ba6c3f6f22","Type":"ContainerStarted","Data":"38078e4c5db1c573fe3af278cc88f79c3ee0f65a4ef482a0939cf7fb8c97e8cf"} Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.518311 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"3292f88e2a346227223c8b7e045f4d492bf2fe48e5000128f262f9fb3fa3d4a3"} Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.518360 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"77219d655cb0cae9e03efcdbb81ebe6929aa8a097422f0b82111b60c7455dd94"} Feb 27 00:09:57 crc kubenswrapper[4781]: I0227 00:09:57.520194 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5cf657794c-phnhf"] Feb 27 00:09:57 crc kubenswrapper[4781]: E0227 00:09:57.545755 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-42hbx" 
podUID="19ed5401-2778-4266-8bf1-1c7244dac100" Feb 27 00:09:57 crc kubenswrapper[4781]: E0227 00:09:57.546680 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-kqrgb" podUID="ac30245d-7e42-440c-99a0-60e2ae15cb8b" Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.550390 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"7c8795e9-9244-4cc4-a297-3aec68bf3588","Type":"ContainerStarted","Data":"2c434d493ffe4d5672fb6269468215eb15ce1d96ef38aac19ec03432d5d7c9b5"} Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.552978 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" event={"ID":"e866e388-01ab-407a-a59b-d0ba6c3f6f22","Type":"ContainerStarted","Data":"e8ae28edd5d5f135e7b1722851782a3a84c982ded51f1807027fdcab0564456f"} Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.553021 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kpnjj" event={"ID":"e866e388-01ab-407a-a59b-d0ba6c3f6f22","Type":"ContainerStarted","Data":"ac3241ff1bfedd5f3c256c91fa7e49e33549d884b07d6bd12e0e7e945b3f26ef"} Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.554144 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"180f65d9-1cb5-411b-a031-6f97c06811d1","Type":"ContainerStarted","Data":"f593c27ca30b9776970bc50285b68eeb2a08fa45251c439f46326b20d1ecb4cb"} Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.554173 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"180f65d9-1cb5-411b-a031-6f97c06811d1","Type":"ContainerStarted","Data":"3684593fa27b2e9ecf581ed7146b0988ecfa46c54636096155683cb0b5d113f6"} Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.556134 4781 generic.go:334] "Generic (PLEG): container finished" podID="514049ae-2568-416f-9705-524c2bf74cbd" containerID="1701a618ae78c1968b5098401e32f2c349b0a0fd1ab9fdf4f23fd86a66112646" exitCode=0 Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.556156 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dj7h5" event={"ID":"514049ae-2568-416f-9705-524c2bf74cbd","Type":"ContainerDied","Data":"1701a618ae78c1968b5098401e32f2c349b0a0fd1ab9fdf4f23fd86a66112646"} Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.562203 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf" event={"ID":"16dfdff5-f774-4b57-adcf-587eb1a87012","Type":"ContainerStarted","Data":"6afdb00675a8fb6a3b5a8c7988539248e23cd32e95fcaa349611040a5a9b0dd3"} Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.562244 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf" event={"ID":"16dfdff5-f774-4b57-adcf-587eb1a87012","Type":"ContainerStarted","Data":"38ad01967f3090afd63325bd381f932e92e8200a882b87d6b61db57aba747513"} Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.562312 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf" podUID="16dfdff5-f774-4b57-adcf-587eb1a87012" containerName="controller-manager" containerID="cri-o://6afdb00675a8fb6a3b5a8c7988539248e23cd32e95fcaa349611040a5a9b0dd3" gracePeriod=30 Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.562420 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf" Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.564937 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d" event={"ID":"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9","Type":"ContainerStarted","Data":"9b2b3a0ab38d37a4eed10a4aaac7da312e7afd3ddf70467638573f0cec7e77bf"} Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.564984 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d" event={"ID":"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9","Type":"ContainerStarted","Data":"afebd9373c32a2481ceafebd85a9c76dfa968abcad4e05173bfa83af78d85aa8"} Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.565596 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-qjwrj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.565595 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d" podUID="52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9" containerName="route-controller-manager" containerID="cri-o://9b2b3a0ab38d37a4eed10a4aaac7da312e7afd3ddf70467638573f0cec7e77bf" gracePeriod=30 Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.565692 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qjwrj" podUID="a75bfacf-8cf7-4560-8b4a-6e876daa4c8c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.571776 4781 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf" Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.599907 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=6.599893024 podStartE2EDuration="6.599893024s" podCreationTimestamp="2026-02-27 00:09:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:58.598736806 +0000 UTC m=+267.856276360" watchObservedRunningTime="2026-02-27 00:09:58.599893024 +0000 UTC m=+267.857432578" Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.619560 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf" podStartSLOduration=29.619543623 podStartE2EDuration="29.619543623s" podCreationTimestamp="2026-02-27 00:09:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:58.61859596 +0000 UTC m=+267.876135514" watchObservedRunningTime="2026-02-27 00:09:58.619543623 +0000 UTC m=+267.877083167" Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.649358 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d" podStartSLOduration=29.649343895 podStartE2EDuration="29.649343895s" podCreationTimestamp="2026-02-27 00:09:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:58.64673393 +0000 UTC m=+267.904273504" watchObservedRunningTime="2026-02-27 00:09:58.649343895 +0000 UTC m=+267.906883449" Feb 27 00:09:58 crc kubenswrapper[4781]: I0227 00:09:58.699677 4781 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=10.699653247 podStartE2EDuration="10.699653247s" podCreationTimestamp="2026-02-27 00:09:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:58.697842812 +0000 UTC m=+267.955382386" watchObservedRunningTime="2026-02-27 00:09:58.699653247 +0000 UTC m=+267.957192811" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.571096 4781 generic.go:334] "Generic (PLEG): container finished" podID="16dfdff5-f774-4b57-adcf-587eb1a87012" containerID="6afdb00675a8fb6a3b5a8c7988539248e23cd32e95fcaa349611040a5a9b0dd3" exitCode=0 Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.571173 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf" event={"ID":"16dfdff5-f774-4b57-adcf-587eb1a87012","Type":"ContainerDied","Data":"6afdb00675a8fb6a3b5a8c7988539248e23cd32e95fcaa349611040a5a9b0dd3"} Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.573154 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-8577b6d867-bbk7d_52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9/route-controller-manager/0.log" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.573188 4781 generic.go:334] "Generic (PLEG): container finished" podID="52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9" containerID="9b2b3a0ab38d37a4eed10a4aaac7da312e7afd3ddf70467638573f0cec7e77bf" exitCode=255 Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.573237 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d" event={"ID":"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9","Type":"ContainerDied","Data":"9b2b3a0ab38d37a4eed10a4aaac7da312e7afd3ddf70467638573f0cec7e77bf"} Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.574363 
4781 generic.go:334] "Generic (PLEG): container finished" podID="180f65d9-1cb5-411b-a031-6f97c06811d1" containerID="f593c27ca30b9776970bc50285b68eeb2a08fa45251c439f46326b20d1ecb4cb" exitCode=0 Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.574444 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"180f65d9-1cb5-411b-a031-6f97c06811d1","Type":"ContainerDied","Data":"f593c27ca30b9776970bc50285b68eeb2a08fa45251c439f46326b20d1ecb4cb"} Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.575281 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-qjwrj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.575326 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qjwrj" podUID="a75bfacf-8cf7-4560-8b4a-6e876daa4c8c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.587357 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-kpnjj" podStartSLOduration=213.587341259 podStartE2EDuration="3m33.587341259s" podCreationTimestamp="2026-02-27 00:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:09:59.586422356 +0000 UTC m=+268.843961910" watchObservedRunningTime="2026-02-27 00:09:59.587341259 +0000 UTC m=+268.844880813" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.658500 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.681592 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5f9656d97f-jxdvc"] Feb 27 00:09:59 crc kubenswrapper[4781]: E0227 00:09:59.681821 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16dfdff5-f774-4b57-adcf-587eb1a87012" containerName="controller-manager" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.681837 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="16dfdff5-f774-4b57-adcf-587eb1a87012" containerName="controller-manager" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.681933 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="16dfdff5-f774-4b57-adcf-587eb1a87012" containerName="controller-manager" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.682298 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.698384 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5f9656d97f-jxdvc"] Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.725654 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16dfdff5-f774-4b57-adcf-587eb1a87012-serving-cert\") pod \"16dfdff5-f774-4b57-adcf-587eb1a87012\" (UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.725747 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16dfdff5-f774-4b57-adcf-587eb1a87012-config\") pod \"16dfdff5-f774-4b57-adcf-587eb1a87012\" (UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.725780 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vtpb\" (UniqueName: \"kubernetes.io/projected/16dfdff5-f774-4b57-adcf-587eb1a87012-kube-api-access-2vtpb\") pod \"16dfdff5-f774-4b57-adcf-587eb1a87012\" (UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.725810 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16dfdff5-f774-4b57-adcf-587eb1a87012-proxy-ca-bundles\") pod \"16dfdff5-f774-4b57-adcf-587eb1a87012\" (UID: \"16dfdff5-f774-4b57-adcf-587eb1a87012\") " Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.725865 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16dfdff5-f774-4b57-adcf-587eb1a87012-client-ca\") pod \"16dfdff5-f774-4b57-adcf-587eb1a87012\" (UID: 
\"16dfdff5-f774-4b57-adcf-587eb1a87012\") " Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.727338 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16dfdff5-f774-4b57-adcf-587eb1a87012-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "16dfdff5-f774-4b57-adcf-587eb1a87012" (UID: "16dfdff5-f774-4b57-adcf-587eb1a87012"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.727371 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16dfdff5-f774-4b57-adcf-587eb1a87012-client-ca" (OuterVolumeSpecName: "client-ca") pod "16dfdff5-f774-4b57-adcf-587eb1a87012" (UID: "16dfdff5-f774-4b57-adcf-587eb1a87012"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.727755 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16dfdff5-f774-4b57-adcf-587eb1a87012-config" (OuterVolumeSpecName: "config") pod "16dfdff5-f774-4b57-adcf-587eb1a87012" (UID: "16dfdff5-f774-4b57-adcf-587eb1a87012"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.739888 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16dfdff5-f774-4b57-adcf-587eb1a87012-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "16dfdff5-f774-4b57-adcf-587eb1a87012" (UID: "16dfdff5-f774-4b57-adcf-587eb1a87012"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.739934 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16dfdff5-f774-4b57-adcf-587eb1a87012-kube-api-access-2vtpb" (OuterVolumeSpecName: "kube-api-access-2vtpb") pod "16dfdff5-f774-4b57-adcf-587eb1a87012" (UID: "16dfdff5-f774-4b57-adcf-587eb1a87012"). InnerVolumeSpecName "kube-api-access-2vtpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.827052 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25c936ed-5081-4365-87f2-90f0cc29bb4e-serving-cert\") pod \"controller-manager-5f9656d97f-jxdvc\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.827124 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25c936ed-5081-4365-87f2-90f0cc29bb4e-client-ca\") pod \"controller-manager-5f9656d97f-jxdvc\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.827211 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25c936ed-5081-4365-87f2-90f0cc29bb4e-config\") pod \"controller-manager-5f9656d97f-jxdvc\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.827232 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg4qp\" 
(UniqueName: \"kubernetes.io/projected/25c936ed-5081-4365-87f2-90f0cc29bb4e-kube-api-access-vg4qp\") pod \"controller-manager-5f9656d97f-jxdvc\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.827344 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25c936ed-5081-4365-87f2-90f0cc29bb4e-proxy-ca-bundles\") pod \"controller-manager-5f9656d97f-jxdvc\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.827382 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16dfdff5-f774-4b57-adcf-587eb1a87012-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.827396 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16dfdff5-f774-4b57-adcf-587eb1a87012-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.827405 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vtpb\" (UniqueName: \"kubernetes.io/projected/16dfdff5-f774-4b57-adcf-587eb1a87012-kube-api-access-2vtpb\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.827427 4781 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16dfdff5-f774-4b57-adcf-587eb1a87012-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.827529 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/16dfdff5-f774-4b57-adcf-587eb1a87012-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.928751 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25c936ed-5081-4365-87f2-90f0cc29bb4e-proxy-ca-bundles\") pod \"controller-manager-5f9656d97f-jxdvc\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.928788 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25c936ed-5081-4365-87f2-90f0cc29bb4e-serving-cert\") pod \"controller-manager-5f9656d97f-jxdvc\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.928823 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25c936ed-5081-4365-87f2-90f0cc29bb4e-client-ca\") pod \"controller-manager-5f9656d97f-jxdvc\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.928847 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25c936ed-5081-4365-87f2-90f0cc29bb4e-config\") pod \"controller-manager-5f9656d97f-jxdvc\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.928872 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg4qp\" (UniqueName: 
\"kubernetes.io/projected/25c936ed-5081-4365-87f2-90f0cc29bb4e-kube-api-access-vg4qp\") pod \"controller-manager-5f9656d97f-jxdvc\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.930486 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25c936ed-5081-4365-87f2-90f0cc29bb4e-client-ca\") pod \"controller-manager-5f9656d97f-jxdvc\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.930571 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25c936ed-5081-4365-87f2-90f0cc29bb4e-config\") pod \"controller-manager-5f9656d97f-jxdvc\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.931089 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25c936ed-5081-4365-87f2-90f0cc29bb4e-proxy-ca-bundles\") pod \"controller-manager-5f9656d97f-jxdvc\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.943079 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25c936ed-5081-4365-87f2-90f0cc29bb4e-serving-cert\") pod \"controller-manager-5f9656d97f-jxdvc\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.944585 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-8577b6d867-bbk7d_52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9/route-controller-manager/0.log" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.944660 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.945451 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg4qp\" (UniqueName: \"kubernetes.io/projected/25c936ed-5081-4365-87f2-90f0cc29bb4e-kube-api-access-vg4qp\") pod \"controller-manager-5f9656d97f-jxdvc\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:09:59 crc kubenswrapper[4781]: I0227 00:09:59.999267 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.029992 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpsw7\" (UniqueName: \"kubernetes.io/projected/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-kube-api-access-qpsw7\") pod \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\" (UID: \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\") " Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.030309 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-config\") pod \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\" (UID: \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\") " Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.030464 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-serving-cert\") pod 
\"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\" (UID: \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\") " Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.030536 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-client-ca\") pod \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\" (UID: \"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9\") " Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.031438 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-client-ca" (OuterVolumeSpecName: "client-ca") pod "52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9" (UID: "52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.031550 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-config" (OuterVolumeSpecName: "config") pod "52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9" (UID: "52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.031799 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.031868 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.035882 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9" (UID: "52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.035902 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-kube-api-access-qpsw7" (OuterVolumeSpecName: "kube-api-access-qpsw7") pod "52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9" (UID: "52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9"). InnerVolumeSpecName "kube-api-access-qpsw7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.129509 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535850-wzxmm"] Feb 27 00:10:00 crc kubenswrapper[4781]: E0227 00:10:00.130024 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9" containerName="route-controller-manager" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.130043 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9" containerName="route-controller-manager" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.130150 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9" containerName="route-controller-manager" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.130477 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535850-wzxmm" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.134046 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.134562 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.134593 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpsw7\" (UniqueName: \"kubernetes.io/projected/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9-kube-api-access-qpsw7\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.136041 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535850-wzxmm"] Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 
00:10:00.174597 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5f9656d97f-jxdvc"] Feb 27 00:10:00 crc kubenswrapper[4781]: W0227 00:10:00.187306 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25c936ed_5081_4365_87f2_90f0cc29bb4e.slice/crio-a674bf23a48c6f2ceef7ad1a8e9b60a25ce02d22f7541a7f5589c91470d8adaf WatchSource:0}: Error finding container a674bf23a48c6f2ceef7ad1a8e9b60a25ce02d22f7541a7f5589c91470d8adaf: Status 404 returned error can't find the container with id a674bf23a48c6f2ceef7ad1a8e9b60a25ce02d22f7541a7f5589c91470d8adaf Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.236033 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tthh\" (UniqueName: \"kubernetes.io/projected/6acff23f-a17a-4f43-a7d6-32c8ccf4b084-kube-api-access-5tthh\") pod \"auto-csr-approver-29535850-wzxmm\" (UID: \"6acff23f-a17a-4f43-a7d6-32c8ccf4b084\") " pod="openshift-infra/auto-csr-approver-29535850-wzxmm" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.337213 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tthh\" (UniqueName: \"kubernetes.io/projected/6acff23f-a17a-4f43-a7d6-32c8ccf4b084-kube-api-access-5tthh\") pod \"auto-csr-approver-29535850-wzxmm\" (UID: \"6acff23f-a17a-4f43-a7d6-32c8ccf4b084\") " pod="openshift-infra/auto-csr-approver-29535850-wzxmm" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.358370 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tthh\" (UniqueName: \"kubernetes.io/projected/6acff23f-a17a-4f43-a7d6-32c8ccf4b084-kube-api-access-5tthh\") pod \"auto-csr-approver-29535850-wzxmm\" (UID: \"6acff23f-a17a-4f43-a7d6-32c8ccf4b084\") " pod="openshift-infra/auto-csr-approver-29535850-wzxmm" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 
00:10:00.461735 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535850-wzxmm" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.584119 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" event={"ID":"25c936ed-5081-4365-87f2-90f0cc29bb4e","Type":"ContainerStarted","Data":"02ad07479bdf08eecb193dd8838b03c4aed4d9358ea1b1b4013fddaa4dd4cb06"} Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.584168 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" event={"ID":"25c936ed-5081-4365-87f2-90f0cc29bb4e","Type":"ContainerStarted","Data":"a674bf23a48c6f2ceef7ad1a8e9b60a25ce02d22f7541a7f5589c91470d8adaf"} Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.584726 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.587749 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf" event={"ID":"16dfdff5-f774-4b57-adcf-587eb1a87012","Type":"ContainerDied","Data":"38ad01967f3090afd63325bd381f932e92e8200a882b87d6b61db57aba747513"} Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.587792 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5cf657794c-phnhf" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.587822 4781 scope.go:117] "RemoveContainer" containerID="6afdb00675a8fb6a3b5a8c7988539248e23cd32e95fcaa349611040a5a9b0dd3" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.590054 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.603145 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-8577b6d867-bbk7d_52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9/route-controller-manager/0.log" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.603225 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d" event={"ID":"52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9","Type":"ContainerDied","Data":"afebd9373c32a2481ceafebd85a9c76dfa968abcad4e05173bfa83af78d85aa8"} Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.603313 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.608935 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" podStartSLOduration=11.608910591 podStartE2EDuration="11.608910591s" podCreationTimestamp="2026-02-27 00:09:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:10:00.60324034 +0000 UTC m=+269.860779904" watchObservedRunningTime="2026-02-27 00:10:00.608910591 +0000 UTC m=+269.866450155" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.609643 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rnj7" event={"ID":"97e44b43-3c8e-4065-a51b-aa3f27c36712","Type":"ContainerStarted","Data":"ae64f7e1eb7e26c6ed833c20f9f08953b97623d771df012e1a6915bbe9ef458b"} Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.662487 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d"] Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.671884 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8577b6d867-bbk7d"] Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.676442 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5rnj7" podStartSLOduration=3.989136788 podStartE2EDuration="47.676429861s" podCreationTimestamp="2026-02-27 00:09:13 +0000 UTC" firstStartedPulling="2026-02-27 00:09:16.09589659 +0000 UTC m=+225.353436144" lastFinishedPulling="2026-02-27 00:09:59.783189663 +0000 UTC m=+269.040729217" observedRunningTime="2026-02-27 00:10:00.674887933 +0000 UTC m=+269.932427487" 
watchObservedRunningTime="2026-02-27 00:10:00.676429861 +0000 UTC m=+269.933969415" Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.691447 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5cf657794c-phnhf"] Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.698872 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5cf657794c-phnhf"] Feb 27 00:10:00 crc kubenswrapper[4781]: I0227 00:10:00.853833 4781 scope.go:117] "RemoveContainer" containerID="9b2b3a0ab38d37a4eed10a4aaac7da312e7afd3ddf70467638573f0cec7e77bf" Feb 27 00:10:01 crc kubenswrapper[4781]: I0227 00:10:01.051316 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 00:10:01 crc kubenswrapper[4781]: I0227 00:10:01.125233 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 00:10:01 crc kubenswrapper[4781]: I0227 00:10:01.269112 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/180f65d9-1cb5-411b-a031-6f97c06811d1-kube-api-access\") pod \"180f65d9-1cb5-411b-a031-6f97c06811d1\" (UID: \"180f65d9-1cb5-411b-a031-6f97c06811d1\") " Feb 27 00:10:01 crc kubenswrapper[4781]: I0227 00:10:01.269314 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/180f65d9-1cb5-411b-a031-6f97c06811d1-kubelet-dir\") pod \"180f65d9-1cb5-411b-a031-6f97c06811d1\" (UID: \"180f65d9-1cb5-411b-a031-6f97c06811d1\") " Feb 27 00:10:01 crc kubenswrapper[4781]: I0227 00:10:01.269680 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/180f65d9-1cb5-411b-a031-6f97c06811d1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod 
"180f65d9-1cb5-411b-a031-6f97c06811d1" (UID: "180f65d9-1cb5-411b-a031-6f97c06811d1"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:10:01 crc kubenswrapper[4781]: I0227 00:10:01.277787 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/180f65d9-1cb5-411b-a031-6f97c06811d1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "180f65d9-1cb5-411b-a031-6f97c06811d1" (UID: "180f65d9-1cb5-411b-a031-6f97c06811d1"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:10:01 crc kubenswrapper[4781]: I0227 00:10:01.321950 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16dfdff5-f774-4b57-adcf-587eb1a87012" path="/var/lib/kubelet/pods/16dfdff5-f774-4b57-adcf-587eb1a87012/volumes" Feb 27 00:10:01 crc kubenswrapper[4781]: I0227 00:10:01.323665 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9" path="/var/lib/kubelet/pods/52bc89a7-b3b5-4ab5-ad64-4df7cd38b1b9/volumes" Feb 27 00:10:01 crc kubenswrapper[4781]: I0227 00:10:01.359326 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535850-wzxmm"] Feb 27 00:10:01 crc kubenswrapper[4781]: W0227 00:10:01.366362 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6acff23f_a17a_4f43_a7d6_32c8ccf4b084.slice/crio-f728769b2293efcebad50430f72dd2a7fcc03a69d6e7c8ee03493c1d72f21662 WatchSource:0}: Error finding container f728769b2293efcebad50430f72dd2a7fcc03a69d6e7c8ee03493c1d72f21662: Status 404 returned error can't find the container with id f728769b2293efcebad50430f72dd2a7fcc03a69d6e7c8ee03493c1d72f21662 Feb 27 00:10:01 crc kubenswrapper[4781]: I0227 00:10:01.370881 4781 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/180f65d9-1cb5-411b-a031-6f97c06811d1-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:01 crc kubenswrapper[4781]: I0227 00:10:01.370921 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/180f65d9-1cb5-411b-a031-6f97c06811d1-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:01 crc kubenswrapper[4781]: I0227 00:10:01.616166 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"180f65d9-1cb5-411b-a031-6f97c06811d1","Type":"ContainerDied","Data":"3684593fa27b2e9ecf581ed7146b0988ecfa46c54636096155683cb0b5d113f6"} Feb 27 00:10:01 crc kubenswrapper[4781]: I0227 00:10:01.616217 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3684593fa27b2e9ecf581ed7146b0988ecfa46c54636096155683cb0b5d113f6" Feb 27 00:10:01 crc kubenswrapper[4781]: I0227 00:10:01.616180 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 00:10:01 crc kubenswrapper[4781]: I0227 00:10:01.619268 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535850-wzxmm" event={"ID":"6acff23f-a17a-4f43-a7d6-32c8ccf4b084","Type":"ContainerStarted","Data":"f728769b2293efcebad50430f72dd2a7fcc03a69d6e7c8ee03493c1d72f21662"} Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.101959 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b"] Feb 27 00:10:02 crc kubenswrapper[4781]: E0227 00:10:02.102177 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="180f65d9-1cb5-411b-a031-6f97c06811d1" containerName="pruner" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.102188 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="180f65d9-1cb5-411b-a031-6f97c06811d1" containerName="pruner" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.102285 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="180f65d9-1cb5-411b-a031-6f97c06811d1" containerName="pruner" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.102981 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.105536 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.106794 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.107035 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.107092 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.107253 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.107714 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.111554 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b"] Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.282473 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24aa2757-b776-45d4-b3b5-29f891553b70-client-ca\") pod \"route-controller-manager-779475f565-cjt4b\" (UID: \"24aa2757-b776-45d4-b3b5-29f891553b70\") " pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.282524 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24aa2757-b776-45d4-b3b5-29f891553b70-config\") pod \"route-controller-manager-779475f565-cjt4b\" (UID: \"24aa2757-b776-45d4-b3b5-29f891553b70\") " pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.282573 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxn2w\" (UniqueName: \"kubernetes.io/projected/24aa2757-b776-45d4-b3b5-29f891553b70-kube-api-access-qxn2w\") pod \"route-controller-manager-779475f565-cjt4b\" (UID: \"24aa2757-b776-45d4-b3b5-29f891553b70\") " pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.282779 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24aa2757-b776-45d4-b3b5-29f891553b70-serving-cert\") pod \"route-controller-manager-779475f565-cjt4b\" (UID: \"24aa2757-b776-45d4-b3b5-29f891553b70\") " pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.384369 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24aa2757-b776-45d4-b3b5-29f891553b70-serving-cert\") pod \"route-controller-manager-779475f565-cjt4b\" (UID: \"24aa2757-b776-45d4-b3b5-29f891553b70\") " pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.384435 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24aa2757-b776-45d4-b3b5-29f891553b70-client-ca\") pod 
\"route-controller-manager-779475f565-cjt4b\" (UID: \"24aa2757-b776-45d4-b3b5-29f891553b70\") " pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.384468 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24aa2757-b776-45d4-b3b5-29f891553b70-config\") pod \"route-controller-manager-779475f565-cjt4b\" (UID: \"24aa2757-b776-45d4-b3b5-29f891553b70\") " pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.384515 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxn2w\" (UniqueName: \"kubernetes.io/projected/24aa2757-b776-45d4-b3b5-29f891553b70-kube-api-access-qxn2w\") pod \"route-controller-manager-779475f565-cjt4b\" (UID: \"24aa2757-b776-45d4-b3b5-29f891553b70\") " pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.385750 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24aa2757-b776-45d4-b3b5-29f891553b70-client-ca\") pod \"route-controller-manager-779475f565-cjt4b\" (UID: \"24aa2757-b776-45d4-b3b5-29f891553b70\") " pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.385860 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24aa2757-b776-45d4-b3b5-29f891553b70-config\") pod \"route-controller-manager-779475f565-cjt4b\" (UID: \"24aa2757-b776-45d4-b3b5-29f891553b70\") " pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.392235 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24aa2757-b776-45d4-b3b5-29f891553b70-serving-cert\") pod \"route-controller-manager-779475f565-cjt4b\" (UID: \"24aa2757-b776-45d4-b3b5-29f891553b70\") " pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.400552 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxn2w\" (UniqueName: \"kubernetes.io/projected/24aa2757-b776-45d4-b3b5-29f891553b70-kube-api-access-qxn2w\") pod \"route-controller-manager-779475f565-cjt4b\" (UID: \"24aa2757-b776-45d4-b3b5-29f891553b70\") " pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.427483 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.626820 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dj7h5" event={"ID":"514049ae-2568-416f-9705-524c2bf74cbd","Type":"ContainerStarted","Data":"6f6652462de4a86b8baaec52317db0d18a1393af64c4cae6b12430f98c10a218"} Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.642448 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b"] Feb 27 00:10:02 crc kubenswrapper[4781]: I0227 00:10:02.646239 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dj7h5" podStartSLOduration=21.235177912 podStartE2EDuration="47.646218913s" podCreationTimestamp="2026-02-27 00:09:15 +0000 UTC" firstStartedPulling="2026-02-27 00:09:34.751025076 +0000 UTC m=+244.008564670" lastFinishedPulling="2026-02-27 00:10:01.162066107 +0000 UTC m=+270.419605671" 
observedRunningTime="2026-02-27 00:10:02.645025183 +0000 UTC m=+271.902564747" watchObservedRunningTime="2026-02-27 00:10:02.646218913 +0000 UTC m=+271.903758467" Feb 27 00:10:03 crc kubenswrapper[4781]: I0227 00:10:03.632930 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" event={"ID":"24aa2757-b776-45d4-b3b5-29f891553b70","Type":"ContainerStarted","Data":"71513888e0482af4d621781d19007e82957003c877ba4a7e270d5f9d9e9840db"} Feb 27 00:10:04 crc kubenswrapper[4781]: I0227 00:10:04.177713 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5rnj7" Feb 27 00:10:04 crc kubenswrapper[4781]: I0227 00:10:04.177783 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5rnj7" Feb 27 00:10:04 crc kubenswrapper[4781]: I0227 00:10:04.701150 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5rnj7" Feb 27 00:10:04 crc kubenswrapper[4781]: I0227 00:10:04.972325 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-qjwrj" Feb 27 00:10:05 crc kubenswrapper[4781]: I0227 00:10:05.537741 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dj7h5" Feb 27 00:10:05 crc kubenswrapper[4781]: I0227 00:10:05.537858 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dj7h5" Feb 27 00:10:06 crc kubenswrapper[4781]: I0227 00:10:06.582039 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dj7h5" podUID="514049ae-2568-416f-9705-524c2bf74cbd" containerName="registry-server" probeResult="failure" output=< Feb 27 00:10:06 crc kubenswrapper[4781]: timeout: failed to connect service 
":50051" within 1s Feb 27 00:10:06 crc kubenswrapper[4781]: > Feb 27 00:10:06 crc kubenswrapper[4781]: I0227 00:10:06.647689 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535850-wzxmm" event={"ID":"6acff23f-a17a-4f43-a7d6-32c8ccf4b084","Type":"ContainerStarted","Data":"313dbdb071dff64579864e870a0b09038434fbe0ef138af4cad66cd56ba9ca0d"} Feb 27 00:10:06 crc kubenswrapper[4781]: I0227 00:10:06.648999 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535848-ccctv" event={"ID":"df035290-8e3c-422b-90ac-573b592defcf","Type":"ContainerStarted","Data":"a316b4241144a66af579b620906b51669485f94b0371b42e5c56ba88e48d2942"} Feb 27 00:10:06 crc kubenswrapper[4781]: I0227 00:10:06.650225 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" event={"ID":"24aa2757-b776-45d4-b3b5-29f891553b70","Type":"ContainerStarted","Data":"dc253f50f4a338e0750935df6010a3f9034e8e95c053110bd1936aa2706a58ba"} Feb 27 00:10:06 crc kubenswrapper[4781]: I0227 00:10:06.663399 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535848-ccctv" podStartSLOduration=67.952041111 podStartE2EDuration="2m6.663378975s" podCreationTimestamp="2026-02-27 00:08:00 +0000 UTC" firstStartedPulling="2026-02-27 00:09:07.214773423 +0000 UTC m=+216.472312977" lastFinishedPulling="2026-02-27 00:10:05.926111297 +0000 UTC m=+275.183650841" observedRunningTime="2026-02-27 00:10:06.659507019 +0000 UTC m=+275.917046593" watchObservedRunningTime="2026-02-27 00:10:06.663378975 +0000 UTC m=+275.920918529" Feb 27 00:10:06 crc kubenswrapper[4781]: I0227 00:10:06.907801 4781 csr.go:261] certificate signing request csr-jc8qj is approved, waiting to be issued Feb 27 00:10:06 crc kubenswrapper[4781]: I0227 00:10:06.915786 4781 csr.go:257] certificate signing request csr-jc8qj is issued Feb 27 00:10:07 crc 
kubenswrapper[4781]: I0227 00:10:07.656859 4781 generic.go:334] "Generic (PLEG): container finished" podID="6acff23f-a17a-4f43-a7d6-32c8ccf4b084" containerID="313dbdb071dff64579864e870a0b09038434fbe0ef138af4cad66cd56ba9ca0d" exitCode=0 Feb 27 00:10:07 crc kubenswrapper[4781]: I0227 00:10:07.656954 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535850-wzxmm" event={"ID":"6acff23f-a17a-4f43-a7d6-32c8ccf4b084","Type":"ContainerDied","Data":"313dbdb071dff64579864e870a0b09038434fbe0ef138af4cad66cd56ba9ca0d"} Feb 27 00:10:07 crc kubenswrapper[4781]: I0227 00:10:07.658891 4781 generic.go:334] "Generic (PLEG): container finished" podID="df035290-8e3c-422b-90ac-573b592defcf" containerID="a316b4241144a66af579b620906b51669485f94b0371b42e5c56ba88e48d2942" exitCode=0 Feb 27 00:10:07 crc kubenswrapper[4781]: I0227 00:10:07.658962 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535848-ccctv" event={"ID":"df035290-8e3c-422b-90ac-573b592defcf","Type":"ContainerDied","Data":"a316b4241144a66af579b620906b51669485f94b0371b42e5c56ba88e48d2942"} Feb 27 00:10:07 crc kubenswrapper[4781]: I0227 00:10:07.659248 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" Feb 27 00:10:07 crc kubenswrapper[4781]: I0227 00:10:07.666980 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" Feb 27 00:10:07 crc kubenswrapper[4781]: I0227 00:10:07.683743 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" podStartSLOduration=18.683724897 podStartE2EDuration="18.683724897s" podCreationTimestamp="2026-02-27 00:09:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:10:07.682868166 +0000 UTC m=+276.940407720" watchObservedRunningTime="2026-02-27 00:10:07.683724897 +0000 UTC m=+276.941264441" Feb 27 00:10:07 crc kubenswrapper[4781]: I0227 00:10:07.917565 4781 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-09 00:41:14.368927459 +0000 UTC Feb 27 00:10:07 crc kubenswrapper[4781]: I0227 00:10:07.917610 4781 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7584h31m6.451319371s for next certificate rotation Feb 27 00:10:08 crc kubenswrapper[4781]: I0227 00:10:08.977682 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535850-wzxmm" Feb 27 00:10:09 crc kubenswrapper[4781]: I0227 00:10:09.088148 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tthh\" (UniqueName: \"kubernetes.io/projected/6acff23f-a17a-4f43-a7d6-32c8ccf4b084-kube-api-access-5tthh\") pod \"6acff23f-a17a-4f43-a7d6-32c8ccf4b084\" (UID: \"6acff23f-a17a-4f43-a7d6-32c8ccf4b084\") " Feb 27 00:10:09 crc kubenswrapper[4781]: I0227 00:10:09.091730 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535848-ccctv" Feb 27 00:10:09 crc kubenswrapper[4781]: I0227 00:10:09.093393 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6acff23f-a17a-4f43-a7d6-32c8ccf4b084-kube-api-access-5tthh" (OuterVolumeSpecName: "kube-api-access-5tthh") pod "6acff23f-a17a-4f43-a7d6-32c8ccf4b084" (UID: "6acff23f-a17a-4f43-a7d6-32c8ccf4b084"). InnerVolumeSpecName "kube-api-access-5tthh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:10:09 crc kubenswrapper[4781]: I0227 00:10:09.193752 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mv9hp\" (UniqueName: \"kubernetes.io/projected/df035290-8e3c-422b-90ac-573b592defcf-kube-api-access-mv9hp\") pod \"df035290-8e3c-422b-90ac-573b592defcf\" (UID: \"df035290-8e3c-422b-90ac-573b592defcf\") " Feb 27 00:10:09 crc kubenswrapper[4781]: I0227 00:10:09.194128 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tthh\" (UniqueName: \"kubernetes.io/projected/6acff23f-a17a-4f43-a7d6-32c8ccf4b084-kube-api-access-5tthh\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:09 crc kubenswrapper[4781]: I0227 00:10:09.198118 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df035290-8e3c-422b-90ac-573b592defcf-kube-api-access-mv9hp" (OuterVolumeSpecName: "kube-api-access-mv9hp") pod "df035290-8e3c-422b-90ac-573b592defcf" (UID: "df035290-8e3c-422b-90ac-573b592defcf"). InnerVolumeSpecName "kube-api-access-mv9hp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:10:09 crc kubenswrapper[4781]: I0227 00:10:09.294982 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mv9hp\" (UniqueName: \"kubernetes.io/projected/df035290-8e3c-422b-90ac-573b592defcf-kube-api-access-mv9hp\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:09 crc kubenswrapper[4781]: I0227 00:10:09.667137 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5f9656d97f-jxdvc"] Feb 27 00:10:09 crc kubenswrapper[4781]: I0227 00:10:09.667434 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" podUID="25c936ed-5081-4365-87f2-90f0cc29bb4e" containerName="controller-manager" containerID="cri-o://02ad07479bdf08eecb193dd8838b03c4aed4d9358ea1b1b4013fddaa4dd4cb06" gracePeriod=30 Feb 27 00:10:09 crc kubenswrapper[4781]: I0227 00:10:09.676140 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535848-ccctv" event={"ID":"df035290-8e3c-422b-90ac-573b592defcf","Type":"ContainerDied","Data":"73bd0b78edcc81c67b914cc89cfaf8646b9814d5783ad5e9856330864dac671a"} Feb 27 00:10:09 crc kubenswrapper[4781]: I0227 00:10:09.676181 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73bd0b78edcc81c67b914cc89cfaf8646b9814d5783ad5e9856330864dac671a" Feb 27 00:10:09 crc kubenswrapper[4781]: I0227 00:10:09.676250 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535848-ccctv" Feb 27 00:10:09 crc kubenswrapper[4781]: I0227 00:10:09.677786 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535850-wzxmm" event={"ID":"6acff23f-a17a-4f43-a7d6-32c8ccf4b084","Type":"ContainerDied","Data":"f728769b2293efcebad50430f72dd2a7fcc03a69d6e7c8ee03493c1d72f21662"} Feb 27 00:10:09 crc kubenswrapper[4781]: I0227 00:10:09.677860 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f728769b2293efcebad50430f72dd2a7fcc03a69d6e7c8ee03493c1d72f21662" Feb 27 00:10:09 crc kubenswrapper[4781]: I0227 00:10:09.677815 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535850-wzxmm" Feb 27 00:10:09 crc kubenswrapper[4781]: I0227 00:10:09.695271 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b"] Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.000905 4781 patch_prober.go:28] interesting pod/controller-manager-5f9656d97f-jxdvc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.63:8443/healthz\": dial tcp 10.217.0.63:8443: connect: connection refused" start-of-body= Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.000982 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" podUID="25c936ed-5081-4365-87f2-90f0cc29bb4e" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.63:8443/healthz\": dial tcp 10.217.0.63:8443: connect: connection refused" Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.686027 4781 generic.go:334] "Generic (PLEG): container finished" podID="25c936ed-5081-4365-87f2-90f0cc29bb4e" 
containerID="02ad07479bdf08eecb193dd8838b03c4aed4d9358ea1b1b4013fddaa4dd4cb06" exitCode=0 Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.686081 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" event={"ID":"25c936ed-5081-4365-87f2-90f0cc29bb4e","Type":"ContainerDied","Data":"02ad07479bdf08eecb193dd8838b03c4aed4d9358ea1b1b4013fddaa4dd4cb06"} Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.688316 4781 generic.go:334] "Generic (PLEG): container finished" podID="baa593f3-06c4-461f-a893-609b07dfd282" containerID="c38378a09cf479f3808a13dfe25b0d67a7d8d45572911fa1d6158aba35ba104f" exitCode=0 Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.688421 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9ngbg" event={"ID":"baa593f3-06c4-461f-a893-609b07dfd282","Type":"ContainerDied","Data":"c38378a09cf479f3808a13dfe25b0d67a7d8d45572911fa1d6158aba35ba104f"} Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.688495 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" podUID="24aa2757-b776-45d4-b3b5-29f891553b70" containerName="route-controller-manager" containerID="cri-o://dc253f50f4a338e0750935df6010a3f9034e8e95c053110bd1936aa2706a58ba" gracePeriod=30 Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.814471 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.858968 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6dccb78d65-ddtvh"] Feb 27 00:10:10 crc kubenswrapper[4781]: E0227 00:10:10.859767 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df035290-8e3c-422b-90ac-573b592defcf" containerName="oc" Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.859834 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="df035290-8e3c-422b-90ac-573b592defcf" containerName="oc" Feb 27 00:10:10 crc kubenswrapper[4781]: E0227 00:10:10.859852 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25c936ed-5081-4365-87f2-90f0cc29bb4e" containerName="controller-manager" Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.859859 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="25c936ed-5081-4365-87f2-90f0cc29bb4e" containerName="controller-manager" Feb 27 00:10:10 crc kubenswrapper[4781]: E0227 00:10:10.859892 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6acff23f-a17a-4f43-a7d6-32c8ccf4b084" containerName="oc" Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.859900 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="6acff23f-a17a-4f43-a7d6-32c8ccf4b084" containerName="oc" Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.860077 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="6acff23f-a17a-4f43-a7d6-32c8ccf4b084" containerName="oc" Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.860095 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="df035290-8e3c-422b-90ac-573b592defcf" containerName="oc" Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.860135 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="25c936ed-5081-4365-87f2-90f0cc29bb4e" 
containerName="controller-manager" Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.862501 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.873731 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6dccb78d65-ddtvh"] Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.919932 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25c936ed-5081-4365-87f2-90f0cc29bb4e-proxy-ca-bundles\") pod \"25c936ed-5081-4365-87f2-90f0cc29bb4e\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.920000 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25c936ed-5081-4365-87f2-90f0cc29bb4e-serving-cert\") pod \"25c936ed-5081-4365-87f2-90f0cc29bb4e\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.920042 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25c936ed-5081-4365-87f2-90f0cc29bb4e-client-ca\") pod \"25c936ed-5081-4365-87f2-90f0cc29bb4e\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.920075 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25c936ed-5081-4365-87f2-90f0cc29bb4e-config\") pod \"25c936ed-5081-4365-87f2-90f0cc29bb4e\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.920117 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vg4qp\" 
(UniqueName: \"kubernetes.io/projected/25c936ed-5081-4365-87f2-90f0cc29bb4e-kube-api-access-vg4qp\") pod \"25c936ed-5081-4365-87f2-90f0cc29bb4e\" (UID: \"25c936ed-5081-4365-87f2-90f0cc29bb4e\") " Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.920230 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6rqm\" (UniqueName: \"kubernetes.io/projected/62880941-3b5d-4517-b0df-8c5548f8298d-kube-api-access-h6rqm\") pod \"controller-manager-6dccb78d65-ddtvh\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") " pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.920282 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62880941-3b5d-4517-b0df-8c5548f8298d-serving-cert\") pod \"controller-manager-6dccb78d65-ddtvh\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") " pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.920325 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62880941-3b5d-4517-b0df-8c5548f8298d-proxy-ca-bundles\") pod \"controller-manager-6dccb78d65-ddtvh\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") " pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.920415 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62880941-3b5d-4517-b0df-8c5548f8298d-config\") pod \"controller-manager-6dccb78d65-ddtvh\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") " pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:10 crc kubenswrapper[4781]: 
I0227 00:10:10.920450 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/62880941-3b5d-4517-b0df-8c5548f8298d-client-ca\") pod \"controller-manager-6dccb78d65-ddtvh\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") " pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.921251 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25c936ed-5081-4365-87f2-90f0cc29bb4e-client-ca" (OuterVolumeSpecName: "client-ca") pod "25c936ed-5081-4365-87f2-90f0cc29bb4e" (UID: "25c936ed-5081-4365-87f2-90f0cc29bb4e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.921373 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25c936ed-5081-4365-87f2-90f0cc29bb4e-config" (OuterVolumeSpecName: "config") pod "25c936ed-5081-4365-87f2-90f0cc29bb4e" (UID: "25c936ed-5081-4365-87f2-90f0cc29bb4e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.922200 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25c936ed-5081-4365-87f2-90f0cc29bb4e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "25c936ed-5081-4365-87f2-90f0cc29bb4e" (UID: "25c936ed-5081-4365-87f2-90f0cc29bb4e"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.926870 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25c936ed-5081-4365-87f2-90f0cc29bb4e-kube-api-access-vg4qp" (OuterVolumeSpecName: "kube-api-access-vg4qp") pod "25c936ed-5081-4365-87f2-90f0cc29bb4e" (UID: "25c936ed-5081-4365-87f2-90f0cc29bb4e"). InnerVolumeSpecName "kube-api-access-vg4qp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:10:10 crc kubenswrapper[4781]: I0227 00:10:10.927351 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25c936ed-5081-4365-87f2-90f0cc29bb4e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "25c936ed-5081-4365-87f2-90f0cc29bb4e" (UID: "25c936ed-5081-4365-87f2-90f0cc29bb4e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.025065 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62880941-3b5d-4517-b0df-8c5548f8298d-proxy-ca-bundles\") pod \"controller-manager-6dccb78d65-ddtvh\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") " pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.025145 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62880941-3b5d-4517-b0df-8c5548f8298d-config\") pod \"controller-manager-6dccb78d65-ddtvh\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") " pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.025173 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/62880941-3b5d-4517-b0df-8c5548f8298d-client-ca\") pod \"controller-manager-6dccb78d65-ddtvh\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") " pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.025199 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6rqm\" (UniqueName: \"kubernetes.io/projected/62880941-3b5d-4517-b0df-8c5548f8298d-kube-api-access-h6rqm\") pod \"controller-manager-6dccb78d65-ddtvh\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") " pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.025224 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62880941-3b5d-4517-b0df-8c5548f8298d-serving-cert\") pod \"controller-manager-6dccb78d65-ddtvh\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") " pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.025257 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25c936ed-5081-4365-87f2-90f0cc29bb4e-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.025267 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25c936ed-5081-4365-87f2-90f0cc29bb4e-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.025277 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25c936ed-5081-4365-87f2-90f0cc29bb4e-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.025286 4781 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-vg4qp\" (UniqueName: \"kubernetes.io/projected/25c936ed-5081-4365-87f2-90f0cc29bb4e-kube-api-access-vg4qp\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.025295 4781 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25c936ed-5081-4365-87f2-90f0cc29bb4e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.026513 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/62880941-3b5d-4517-b0df-8c5548f8298d-client-ca\") pod \"controller-manager-6dccb78d65-ddtvh\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") " pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.027228 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62880941-3b5d-4517-b0df-8c5548f8298d-proxy-ca-bundles\") pod \"controller-manager-6dccb78d65-ddtvh\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") " pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.027383 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62880941-3b5d-4517-b0df-8c5548f8298d-config\") pod \"controller-manager-6dccb78d65-ddtvh\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") " pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.034093 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62880941-3b5d-4517-b0df-8c5548f8298d-serving-cert\") pod \"controller-manager-6dccb78d65-ddtvh\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") " 
pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.046992 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6rqm\" (UniqueName: \"kubernetes.io/projected/62880941-3b5d-4517-b0df-8c5548f8298d-kube-api-access-h6rqm\") pod \"controller-manager-6dccb78d65-ddtvh\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") " pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.112709 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.125618 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24aa2757-b776-45d4-b3b5-29f891553b70-client-ca\") pod \"24aa2757-b776-45d4-b3b5-29f891553b70\" (UID: \"24aa2757-b776-45d4-b3b5-29f891553b70\") " Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.125760 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxn2w\" (UniqueName: \"kubernetes.io/projected/24aa2757-b776-45d4-b3b5-29f891553b70-kube-api-access-qxn2w\") pod \"24aa2757-b776-45d4-b3b5-29f891553b70\" (UID: \"24aa2757-b776-45d4-b3b5-29f891553b70\") " Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.125792 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24aa2757-b776-45d4-b3b5-29f891553b70-config\") pod \"24aa2757-b776-45d4-b3b5-29f891553b70\" (UID: \"24aa2757-b776-45d4-b3b5-29f891553b70\") " Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.125814 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/24aa2757-b776-45d4-b3b5-29f891553b70-serving-cert\") pod \"24aa2757-b776-45d4-b3b5-29f891553b70\" (UID: \"24aa2757-b776-45d4-b3b5-29f891553b70\") " Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.128217 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24aa2757-b776-45d4-b3b5-29f891553b70-config" (OuterVolumeSpecName: "config") pod "24aa2757-b776-45d4-b3b5-29f891553b70" (UID: "24aa2757-b776-45d4-b3b5-29f891553b70"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.128206 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24aa2757-b776-45d4-b3b5-29f891553b70-client-ca" (OuterVolumeSpecName: "client-ca") pod "24aa2757-b776-45d4-b3b5-29f891553b70" (UID: "24aa2757-b776-45d4-b3b5-29f891553b70"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.132443 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24aa2757-b776-45d4-b3b5-29f891553b70-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "24aa2757-b776-45d4-b3b5-29f891553b70" (UID: "24aa2757-b776-45d4-b3b5-29f891553b70"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.132832 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24aa2757-b776-45d4-b3b5-29f891553b70-kube-api-access-qxn2w" (OuterVolumeSpecName: "kube-api-access-qxn2w") pod "24aa2757-b776-45d4-b3b5-29f891553b70" (UID: "24aa2757-b776-45d4-b3b5-29f891553b70"). InnerVolumeSpecName "kube-api-access-qxn2w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.181223 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.226994 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24aa2757-b776-45d4-b3b5-29f891553b70-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.227029 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24aa2757-b776-45d4-b3b5-29f891553b70-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.227038 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24aa2757-b776-45d4-b3b5-29f891553b70-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.227050 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxn2w\" (UniqueName: \"kubernetes.io/projected/24aa2757-b776-45d4-b3b5-29f891553b70-kube-api-access-qxn2w\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.428437 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6dccb78d65-ddtvh"] Feb 27 00:10:11 crc kubenswrapper[4781]: W0227 00:10:11.435477 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62880941_3b5d_4517_b0df_8c5548f8298d.slice/crio-b10681d24475519937e6ebec881f7fb610590065caed7686034ef5962eec396d WatchSource:0}: Error finding container b10681d24475519937e6ebec881f7fb610590065caed7686034ef5962eec396d: Status 404 returned error can't find the container with id 
b10681d24475519937e6ebec881f7fb610590065caed7686034ef5962eec396d Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.699111 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9ngbg" event={"ID":"baa593f3-06c4-461f-a893-609b07dfd282","Type":"ContainerStarted","Data":"282fc96c8ce87017d7802dd18ea9155171e4759e3a94a26367d5ecb97c6d0ec5"} Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.700612 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hcdz5" event={"ID":"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0","Type":"ContainerStarted","Data":"a025dd078f729ded0bf6d545799c77a77bef28096df89b8ae88c1133a03f1093"} Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.701465 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" event={"ID":"62880941-3b5d-4517-b0df-8c5548f8298d","Type":"ContainerStarted","Data":"b10681d24475519937e6ebec881f7fb610590065caed7686034ef5962eec396d"} Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.702682 4781 generic.go:334] "Generic (PLEG): container finished" podID="24aa2757-b776-45d4-b3b5-29f891553b70" containerID="dc253f50f4a338e0750935df6010a3f9034e8e95c053110bd1936aa2706a58ba" exitCode=0 Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.702733 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" event={"ID":"24aa2757-b776-45d4-b3b5-29f891553b70","Type":"ContainerDied","Data":"dc253f50f4a338e0750935df6010a3f9034e8e95c053110bd1936aa2706a58ba"} Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.702730 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.702766 4781 scope.go:117] "RemoveContainer" containerID="dc253f50f4a338e0750935df6010a3f9034e8e95c053110bd1936aa2706a58ba" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.702755 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b" event={"ID":"24aa2757-b776-45d4-b3b5-29f891553b70","Type":"ContainerDied","Data":"71513888e0482af4d621781d19007e82957003c877ba4a7e270d5f9d9e9840db"} Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.706479 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" event={"ID":"25c936ed-5081-4365-87f2-90f0cc29bb4e","Type":"ContainerDied","Data":"a674bf23a48c6f2ceef7ad1a8e9b60a25ce02d22f7541a7f5589c91470d8adaf"} Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.706608 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5f9656d97f-jxdvc" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.711238 4781 generic.go:334] "Generic (PLEG): container finished" podID="ac30245d-7e42-440c-99a0-60e2ae15cb8b" containerID="7a5bc22436045a92f14d9e48387b73688e7285010edca28bce2bf80e2706ff98" exitCode=0 Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.711307 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kqrgb" event={"ID":"ac30245d-7e42-440c-99a0-60e2ae15cb8b","Type":"ContainerDied","Data":"7a5bc22436045a92f14d9e48387b73688e7285010edca28bce2bf80e2706ff98"} Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.717877 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9ngbg" podStartSLOduration=3.735382734 podStartE2EDuration="58.717858732s" podCreationTimestamp="2026-02-27 00:09:13 +0000 UTC" firstStartedPulling="2026-02-27 00:09:16.175931911 +0000 UTC m=+225.433471465" lastFinishedPulling="2026-02-27 00:10:11.158407909 +0000 UTC m=+280.415947463" observedRunningTime="2026-02-27 00:10:11.716802006 +0000 UTC m=+280.974341590" watchObservedRunningTime="2026-02-27 00:10:11.717858732 +0000 UTC m=+280.975398286" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.724622 4781 scope.go:117] "RemoveContainer" containerID="dc253f50f4a338e0750935df6010a3f9034e8e95c053110bd1936aa2706a58ba" Feb 27 00:10:11 crc kubenswrapper[4781]: E0227 00:10:11.725209 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc253f50f4a338e0750935df6010a3f9034e8e95c053110bd1936aa2706a58ba\": container with ID starting with dc253f50f4a338e0750935df6010a3f9034e8e95c053110bd1936aa2706a58ba not found: ID does not exist" containerID="dc253f50f4a338e0750935df6010a3f9034e8e95c053110bd1936aa2706a58ba" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 
00:10:11.725273 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc253f50f4a338e0750935df6010a3f9034e8e95c053110bd1936aa2706a58ba"} err="failed to get container status \"dc253f50f4a338e0750935df6010a3f9034e8e95c053110bd1936aa2706a58ba\": rpc error: code = NotFound desc = could not find container \"dc253f50f4a338e0750935df6010a3f9034e8e95c053110bd1936aa2706a58ba\": container with ID starting with dc253f50f4a338e0750935df6010a3f9034e8e95c053110bd1936aa2706a58ba not found: ID does not exist" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.725317 4781 scope.go:117] "RemoveContainer" containerID="02ad07479bdf08eecb193dd8838b03c4aed4d9358ea1b1b4013fddaa4dd4cb06" Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.735402 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b"] Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.752573 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-779475f565-cjt4b"] Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.756929 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5f9656d97f-jxdvc"] Feb 27 00:10:11 crc kubenswrapper[4781]: I0227 00:10:11.759740 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5f9656d97f-jxdvc"] Feb 27 00:10:12 crc kubenswrapper[4781]: I0227 00:10:12.718085 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" event={"ID":"62880941-3b5d-4517-b0df-8c5548f8298d","Type":"ContainerStarted","Data":"b8e38f1e216924c6d8d1fbab43dc9e21f5123054b598be25b535e4702b38b353"} Feb 27 00:10:12 crc kubenswrapper[4781]: I0227 00:10:12.719557 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:12 crc kubenswrapper[4781]: I0227 00:10:12.734968 4781 generic.go:334] "Generic (PLEG): container finished" podID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" containerID="a025dd078f729ded0bf6d545799c77a77bef28096df89b8ae88c1133a03f1093" exitCode=0 Feb 27 00:10:12 crc kubenswrapper[4781]: I0227 00:10:12.735023 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hcdz5" event={"ID":"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0","Type":"ContainerDied","Data":"a025dd078f729ded0bf6d545799c77a77bef28096df89b8ae88c1133a03f1093"} Feb 27 00:10:12 crc kubenswrapper[4781]: I0227 00:10:12.736234 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" Feb 27 00:10:12 crc kubenswrapper[4781]: I0227 00:10:12.746524 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" podStartSLOduration=3.7464937320000002 podStartE2EDuration="3.746493732s" podCreationTimestamp="2026-02-27 00:10:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:10:12.742330348 +0000 UTC m=+281.999869902" watchObservedRunningTime="2026-02-27 00:10:12.746493732 +0000 UTC m=+282.004033276" Feb 27 00:10:12 crc kubenswrapper[4781]: I0227 00:10:12.895557 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:10:12 crc kubenswrapper[4781]: I0227 00:10:12.895664 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" 
podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:10:12 crc kubenswrapper[4781]: I0227 00:10:12.895729 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:10:12 crc kubenswrapper[4781]: I0227 00:10:12.896483 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f5be70a2916213c961759992806ae032decbc8c1382f7d82de2a5da221aee089"} pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 00:10:12 crc kubenswrapper[4781]: I0227 00:10:12.896564 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" containerID="cri-o://f5be70a2916213c961759992806ae032decbc8c1382f7d82de2a5da221aee089" gracePeriod=600 Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.112697 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl"] Feb 27 00:10:13 crc kubenswrapper[4781]: E0227 00:10:13.112941 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24aa2757-b776-45d4-b3b5-29f891553b70" containerName="route-controller-manager" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.112953 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="24aa2757-b776-45d4-b3b5-29f891553b70" containerName="route-controller-manager" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.113051 4781 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="24aa2757-b776-45d4-b3b5-29f891553b70" containerName="route-controller-manager" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.113511 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.115603 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.117798 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.117802 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.118010 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.119538 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.119579 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.129268 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl"] Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.271560 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txcb7\" (UniqueName: \"kubernetes.io/projected/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-kube-api-access-txcb7\") pod \"route-controller-manager-586cccdbf9-r7jxl\" (UID: 
\"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\") " pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.271676 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-client-ca\") pod \"route-controller-manager-586cccdbf9-r7jxl\" (UID: \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\") " pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.271766 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-serving-cert\") pod \"route-controller-manager-586cccdbf9-r7jxl\" (UID: \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\") " pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.271789 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-config\") pod \"route-controller-manager-586cccdbf9-r7jxl\" (UID: \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\") " pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.323234 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24aa2757-b776-45d4-b3b5-29f891553b70" path="/var/lib/kubelet/pods/24aa2757-b776-45d4-b3b5-29f891553b70/volumes" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.324118 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25c936ed-5081-4365-87f2-90f0cc29bb4e" path="/var/lib/kubelet/pods/25c936ed-5081-4365-87f2-90f0cc29bb4e/volumes" Feb 27 00:10:13 crc 
kubenswrapper[4781]: I0227 00:10:13.372548 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-serving-cert\") pod \"route-controller-manager-586cccdbf9-r7jxl\" (UID: \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\") " pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.372855 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-config\") pod \"route-controller-manager-586cccdbf9-r7jxl\" (UID: \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\") " pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.372922 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txcb7\" (UniqueName: \"kubernetes.io/projected/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-kube-api-access-txcb7\") pod \"route-controller-manager-586cccdbf9-r7jxl\" (UID: \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\") " pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.372970 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-client-ca\") pod \"route-controller-manager-586cccdbf9-r7jxl\" (UID: \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\") " pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.373840 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-config\") pod 
\"route-controller-manager-586cccdbf9-r7jxl\" (UID: \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\") " pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.373884 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-client-ca\") pod \"route-controller-manager-586cccdbf9-r7jxl\" (UID: \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\") " pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.379481 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-serving-cert\") pod \"route-controller-manager-586cccdbf9-r7jxl\" (UID: \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\") " pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.391334 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txcb7\" (UniqueName: \"kubernetes.io/projected/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-kube-api-access-txcb7\") pod \"route-controller-manager-586cccdbf9-r7jxl\" (UID: \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\") " pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.433151 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.745552 4781 generic.go:334] "Generic (PLEG): container finished" podID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerID="f5be70a2916213c961759992806ae032decbc8c1382f7d82de2a5da221aee089" exitCode=0 Feb 27 00:10:13 crc kubenswrapper[4781]: I0227 00:10:13.745688 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerDied","Data":"f5be70a2916213c961759992806ae032decbc8c1382f7d82de2a5da221aee089"} Feb 27 00:10:14 crc kubenswrapper[4781]: I0227 00:10:14.076539 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9ngbg" Feb 27 00:10:14 crc kubenswrapper[4781]: I0227 00:10:14.076602 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9ngbg" Feb 27 00:10:14 crc kubenswrapper[4781]: I0227 00:10:14.159029 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9ngbg" Feb 27 00:10:14 crc kubenswrapper[4781]: I0227 00:10:14.231032 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5rnj7" Feb 27 00:10:15 crc kubenswrapper[4781]: I0227 00:10:15.600797 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dj7h5" Feb 27 00:10:15 crc kubenswrapper[4781]: I0227 00:10:15.722322 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dj7h5" Feb 27 00:10:18 crc kubenswrapper[4781]: I0227 00:10:18.147940 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-5rnj7"] Feb 27 00:10:18 crc kubenswrapper[4781]: I0227 00:10:18.148371 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5rnj7" podUID="97e44b43-3c8e-4065-a51b-aa3f27c36712" containerName="registry-server" containerID="cri-o://ae64f7e1eb7e26c6ed833c20f9f08953b97623d771df012e1a6915bbe9ef458b" gracePeriod=2 Feb 27 00:10:19 crc kubenswrapper[4781]: I0227 00:10:19.146499 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dj7h5"] Feb 27 00:10:19 crc kubenswrapper[4781]: I0227 00:10:19.147282 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dj7h5" podUID="514049ae-2568-416f-9705-524c2bf74cbd" containerName="registry-server" containerID="cri-o://6f6652462de4a86b8baaec52317db0d18a1393af64c4cae6b12430f98c10a218" gracePeriod=2 Feb 27 00:10:19 crc kubenswrapper[4781]: I0227 00:10:19.796507 4781 generic.go:334] "Generic (PLEG): container finished" podID="97e44b43-3c8e-4065-a51b-aa3f27c36712" containerID="ae64f7e1eb7e26c6ed833c20f9f08953b97623d771df012e1a6915bbe9ef458b" exitCode=0 Feb 27 00:10:19 crc kubenswrapper[4781]: I0227 00:10:19.796572 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rnj7" event={"ID":"97e44b43-3c8e-4065-a51b-aa3f27c36712","Type":"ContainerDied","Data":"ae64f7e1eb7e26c6ed833c20f9f08953b97623d771df012e1a6915bbe9ef458b"} Feb 27 00:10:20 crc kubenswrapper[4781]: I0227 00:10:20.802673 4781 generic.go:334] "Generic (PLEG): container finished" podID="514049ae-2568-416f-9705-524c2bf74cbd" containerID="6f6652462de4a86b8baaec52317db0d18a1393af64c4cae6b12430f98c10a218" exitCode=0 Feb 27 00:10:20 crc kubenswrapper[4781]: I0227 00:10:20.802709 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dj7h5" 
event={"ID":"514049ae-2568-416f-9705-524c2bf74cbd","Type":"ContainerDied","Data":"6f6652462de4a86b8baaec52317db0d18a1393af64c4cae6b12430f98c10a218"} Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.786178 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5rnj7" Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.790878 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dj7h5" Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.810523 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rnj7" event={"ID":"97e44b43-3c8e-4065-a51b-aa3f27c36712","Type":"ContainerDied","Data":"b4c78b3d5964c2a730f268fed158cc29cd746663976e63644c0b8dcc232f4b12"} Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.810579 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5rnj7" Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.810583 4781 scope.go:117] "RemoveContainer" containerID="ae64f7e1eb7e26c6ed833c20f9f08953b97623d771df012e1a6915bbe9ef458b" Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.814658 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dj7h5" event={"ID":"514049ae-2568-416f-9705-524c2bf74cbd","Type":"ContainerDied","Data":"45d5d509e8ad0dc50e09ff3936cc7a26189c6c645b18672248f0a72722749ca4"} Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.814740 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dj7h5" Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.889766 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97e44b43-3c8e-4065-a51b-aa3f27c36712-utilities\") pod \"97e44b43-3c8e-4065-a51b-aa3f27c36712\" (UID: \"97e44b43-3c8e-4065-a51b-aa3f27c36712\") " Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.889821 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97e44b43-3c8e-4065-a51b-aa3f27c36712-catalog-content\") pod \"97e44b43-3c8e-4065-a51b-aa3f27c36712\" (UID: \"97e44b43-3c8e-4065-a51b-aa3f27c36712\") " Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.889875 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z44vv\" (UniqueName: \"kubernetes.io/projected/514049ae-2568-416f-9705-524c2bf74cbd-kube-api-access-z44vv\") pod \"514049ae-2568-416f-9705-524c2bf74cbd\" (UID: \"514049ae-2568-416f-9705-524c2bf74cbd\") " Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.889924 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/514049ae-2568-416f-9705-524c2bf74cbd-catalog-content\") pod \"514049ae-2568-416f-9705-524c2bf74cbd\" (UID: \"514049ae-2568-416f-9705-524c2bf74cbd\") " Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.890020 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6frr\" (UniqueName: \"kubernetes.io/projected/97e44b43-3c8e-4065-a51b-aa3f27c36712-kube-api-access-w6frr\") pod \"97e44b43-3c8e-4065-a51b-aa3f27c36712\" (UID: \"97e44b43-3c8e-4065-a51b-aa3f27c36712\") " Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.890053 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/514049ae-2568-416f-9705-524c2bf74cbd-utilities\") pod \"514049ae-2568-416f-9705-524c2bf74cbd\" (UID: \"514049ae-2568-416f-9705-524c2bf74cbd\") " Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.890711 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97e44b43-3c8e-4065-a51b-aa3f27c36712-utilities" (OuterVolumeSpecName: "utilities") pod "97e44b43-3c8e-4065-a51b-aa3f27c36712" (UID: "97e44b43-3c8e-4065-a51b-aa3f27c36712"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.891133 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/514049ae-2568-416f-9705-524c2bf74cbd-utilities" (OuterVolumeSpecName: "utilities") pod "514049ae-2568-416f-9705-524c2bf74cbd" (UID: "514049ae-2568-416f-9705-524c2bf74cbd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.896154 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97e44b43-3c8e-4065-a51b-aa3f27c36712-kube-api-access-w6frr" (OuterVolumeSpecName: "kube-api-access-w6frr") pod "97e44b43-3c8e-4065-a51b-aa3f27c36712" (UID: "97e44b43-3c8e-4065-a51b-aa3f27c36712"). InnerVolumeSpecName "kube-api-access-w6frr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.903813 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/514049ae-2568-416f-9705-524c2bf74cbd-kube-api-access-z44vv" (OuterVolumeSpecName: "kube-api-access-z44vv") pod "514049ae-2568-416f-9705-524c2bf74cbd" (UID: "514049ae-2568-416f-9705-524c2bf74cbd"). InnerVolumeSpecName "kube-api-access-z44vv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.925414 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97e44b43-3c8e-4065-a51b-aa3f27c36712-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "97e44b43-3c8e-4065-a51b-aa3f27c36712" (UID: "97e44b43-3c8e-4065-a51b-aa3f27c36712"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.992127 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6frr\" (UniqueName: \"kubernetes.io/projected/97e44b43-3c8e-4065-a51b-aa3f27c36712-kube-api-access-w6frr\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.992174 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/514049ae-2568-416f-9705-524c2bf74cbd-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.992193 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97e44b43-3c8e-4065-a51b-aa3f27c36712-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.992210 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97e44b43-3c8e-4065-a51b-aa3f27c36712-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:21 crc kubenswrapper[4781]: I0227 00:10:21.992229 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z44vv\" (UniqueName: \"kubernetes.io/projected/514049ae-2568-416f-9705-524c2bf74cbd-kube-api-access-z44vv\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:22 crc kubenswrapper[4781]: I0227 00:10:22.034780 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/514049ae-2568-416f-9705-524c2bf74cbd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "514049ae-2568-416f-9705-524c2bf74cbd" (UID: "514049ae-2568-416f-9705-524c2bf74cbd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:10:22 crc kubenswrapper[4781]: I0227 00:10:22.093422 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/514049ae-2568-416f-9705-524c2bf74cbd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:22 crc kubenswrapper[4781]: I0227 00:10:22.155368 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5rnj7"] Feb 27 00:10:22 crc kubenswrapper[4781]: I0227 00:10:22.162290 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5rnj7"] Feb 27 00:10:22 crc kubenswrapper[4781]: I0227 00:10:22.170909 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dj7h5"] Feb 27 00:10:22 crc kubenswrapper[4781]: I0227 00:10:22.174894 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dj7h5"] Feb 27 00:10:22 crc kubenswrapper[4781]: I0227 00:10:22.589569 4781 scope.go:117] "RemoveContainer" containerID="9c12fe8df9037297d8af4eedba0d4e04fa1c5be02d943f1b25318346033b7fc9" Feb 27 00:10:23 crc kubenswrapper[4781]: I0227 00:10:23.316520 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="514049ae-2568-416f-9705-524c2bf74cbd" path="/var/lib/kubelet/pods/514049ae-2568-416f-9705-524c2bf74cbd/volumes" Feb 27 00:10:23 crc kubenswrapper[4781]: I0227 00:10:23.318828 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97e44b43-3c8e-4065-a51b-aa3f27c36712" path="/var/lib/kubelet/pods/97e44b43-3c8e-4065-a51b-aa3f27c36712/volumes" Feb 27 00:10:23 crc kubenswrapper[4781]: I0227 
00:10:23.441770 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl"]
Feb 27 00:10:23 crc kubenswrapper[4781]: W0227 00:10:23.948966 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb04ba107_8dd1_4d8d_88d3_5a762f6c60f1.slice/crio-f45bacdd853d75d92adb8b303b061914233c1920428a49de04d9a44ec247f82d WatchSource:0}: Error finding container f45bacdd853d75d92adb8b303b061914233c1920428a49de04d9a44ec247f82d: Status 404 returned error can't find the container with id f45bacdd853d75d92adb8b303b061914233c1920428a49de04d9a44ec247f82d
Feb 27 00:10:24 crc kubenswrapper[4781]: I0227 00:10:24.008417 4781 scope.go:117] "RemoveContainer" containerID="b414a361ce30e28fdc5bc47f53f766e6427e2ccb8cfe76be4eed8ce4ee48ebca"
Feb 27 00:10:24 crc kubenswrapper[4781]: I0227 00:10:24.088989 4781 scope.go:117] "RemoveContainer" containerID="6f6652462de4a86b8baaec52317db0d18a1393af64c4cae6b12430f98c10a218"
Feb 27 00:10:24 crc kubenswrapper[4781]: I0227 00:10:24.136853 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9ngbg"
Feb 27 00:10:24 crc kubenswrapper[4781]: I0227 00:10:24.165923 4781 scope.go:117] "RemoveContainer" containerID="1701a618ae78c1968b5098401e32f2c349b0a0fd1ab9fdf4f23fd86a66112646"
Feb 27 00:10:24 crc kubenswrapper[4781]: I0227 00:10:24.207029 4781 scope.go:117] "RemoveContainer" containerID="39f26f7fa9552ef0082d4338be84e32dc690ddb73a7ed4be83f09421026f56c7"
Feb 27 00:10:24 crc kubenswrapper[4781]: I0227 00:10:24.842061 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" event={"ID":"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1","Type":"ContainerStarted","Data":"f45bacdd853d75d92adb8b303b061914233c1920428a49de04d9a44ec247f82d"}
Feb 27 00:10:25 crc kubenswrapper[4781]: I0227 00:10:25.852916 4781 generic.go:334] "Generic (PLEG): container finished" podID="2b050e9e-d6c8-4e27-ad3f-9681553c1539" containerID="254d4c5bba2fd28d3a63e1836566567a36a209ed04e850df387dd2dfb34874d8" exitCode=0
Feb 27 00:10:25 crc kubenswrapper[4781]: I0227 00:10:25.853093 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kztqg" event={"ID":"2b050e9e-d6c8-4e27-ad3f-9681553c1539","Type":"ContainerDied","Data":"254d4c5bba2fd28d3a63e1836566567a36a209ed04e850df387dd2dfb34874d8"}
Feb 27 00:10:25 crc kubenswrapper[4781]: I0227 00:10:25.856804 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kqrgb" event={"ID":"ac30245d-7e42-440c-99a0-60e2ae15cb8b","Type":"ContainerStarted","Data":"ec7472b1d4abe3539fd2b9c6a74552c975f1e7a845d80d7f3684a0e55a838de1"}
Feb 27 00:10:25 crc kubenswrapper[4781]: I0227 00:10:25.861090 4781 generic.go:334] "Generic (PLEG): container finished" podID="19ed5401-2778-4266-8bf1-1c7244dac100" containerID="f154c29cb3bee2cd302922fe50110d556829b9ccb083957d00548536956edb1c" exitCode=0
Feb 27 00:10:25 crc kubenswrapper[4781]: I0227 00:10:25.861175 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42hbx" event={"ID":"19ed5401-2778-4266-8bf1-1c7244dac100","Type":"ContainerDied","Data":"f154c29cb3bee2cd302922fe50110d556829b9ccb083957d00548536956edb1c"}
Feb 27 00:10:25 crc kubenswrapper[4781]: I0227 00:10:25.862858 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" event={"ID":"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1","Type":"ContainerStarted","Data":"97648a6a7e717d8f52757923ad7a40f42501a7856e07525cea9b570b3b64fec5"}
Feb 27 00:10:25 crc kubenswrapper[4781]: I0227 00:10:25.863259 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl"
Feb 27 00:10:25 crc kubenswrapper[4781]: I0227 00:10:25.865676 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hcdz5" event={"ID":"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0","Type":"ContainerStarted","Data":"170f4f45b93514562c4db75daba0bc008f77b1104896d8febf847761526ee3f3"}
Feb 27 00:10:25 crc kubenswrapper[4781]: I0227 00:10:25.867746 4781 generic.go:334] "Generic (PLEG): container finished" podID="0f286d62-2145-4bbb-91eb-28ffda9b2494" containerID="4a5aea3fd523014c3b1508c7df954cc0b54bc1a3d937ac4cfe23aff82780bb4f" exitCode=0
Feb 27 00:10:25 crc kubenswrapper[4781]: I0227 00:10:25.867844 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52xgq" event={"ID":"0f286d62-2145-4bbb-91eb-28ffda9b2494","Type":"ContainerDied","Data":"4a5aea3fd523014c3b1508c7df954cc0b54bc1a3d937ac4cfe23aff82780bb4f"}
Feb 27 00:10:25 crc kubenswrapper[4781]: I0227 00:10:25.870451 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"0a9584e9887d3110a6a6d2ad5c5024fb38c734637c177fd2cbddb2eae4932cdc"}
Feb 27 00:10:25 crc kubenswrapper[4781]: I0227 00:10:25.874758 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl"
Feb 27 00:10:25 crc kubenswrapper[4781]: I0227 00:10:25.904361 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kqrgb" podStartSLOduration=7.773629081 podStartE2EDuration="1m14.904342472s" podCreationTimestamp="2026-02-27 00:09:11 +0000 UTC" firstStartedPulling="2026-02-27 00:09:12.932463446 +0000 UTC m=+222.190003000" lastFinishedPulling="2026-02-27 00:10:20.063176827 +0000 UTC m=+289.320716391" observedRunningTime="2026-02-27 00:10:25.901585663 +0000 UTC m=+295.159125217" watchObservedRunningTime="2026-02-27 00:10:25.904342472 +0000 UTC m=+295.161882036"
Feb 27 00:10:25 crc kubenswrapper[4781]: I0227 00:10:25.925240 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" podStartSLOduration=16.925222371 podStartE2EDuration="16.925222371s" podCreationTimestamp="2026-02-27 00:10:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:10:25.922192716 +0000 UTC m=+295.179732270" watchObservedRunningTime="2026-02-27 00:10:25.925222371 +0000 UTC m=+295.182761925"
Feb 27 00:10:25 crc kubenswrapper[4781]: I0227 00:10:25.963662 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hcdz5" podStartSLOduration=4.126813534 podStartE2EDuration="1m11.963645507s" podCreationTimestamp="2026-02-27 00:09:14 +0000 UTC" firstStartedPulling="2026-02-27 00:09:16.111037399 +0000 UTC m=+225.368576953" lastFinishedPulling="2026-02-27 00:10:23.947869342 +0000 UTC m=+293.205408926" observedRunningTime="2026-02-27 00:10:25.960659673 +0000 UTC m=+295.218199227" watchObservedRunningTime="2026-02-27 00:10:25.963645507 +0000 UTC m=+295.221185061"
Feb 27 00:10:26 crc kubenswrapper[4781]: I0227 00:10:26.877347 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kztqg" event={"ID":"2b050e9e-d6c8-4e27-ad3f-9681553c1539","Type":"ContainerStarted","Data":"efb81711e0a5a335934438d3b0efa6534c86c0cbe83ca3ff2213393cf6293c2a"}
Feb 27 00:10:26 crc kubenswrapper[4781]: I0227 00:10:26.879943 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42hbx" event={"ID":"19ed5401-2778-4266-8bf1-1c7244dac100","Type":"ContainerStarted","Data":"22354d7c75573a2d5595bd3f22870bfd400c62f8b630846cdf77af9fd324f7ce"}
Feb 27 00:10:26 crc kubenswrapper[4781]: I0227 00:10:26.882563 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52xgq" event={"ID":"0f286d62-2145-4bbb-91eb-28ffda9b2494","Type":"ContainerStarted","Data":"257d15c87d7e86d0b22fe731221ea29f1baa1f76ffdd99d32b45f52129583bd8"}
Feb 27 00:10:26 crc kubenswrapper[4781]: I0227 00:10:26.917667 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kztqg" podStartSLOduration=2.353346412 podStartE2EDuration="1m15.917647129s" podCreationTimestamp="2026-02-27 00:09:11 +0000 UTC" firstStartedPulling="2026-02-27 00:09:13.005814103 +0000 UTC m=+222.263353657" lastFinishedPulling="2026-02-27 00:10:26.57011482 +0000 UTC m=+295.827654374" observedRunningTime="2026-02-27 00:10:26.899220351 +0000 UTC m=+296.156759905" watchObservedRunningTime="2026-02-27 00:10:26.917647129 +0000 UTC m=+296.175186693"
Feb 27 00:10:26 crc kubenswrapper[4781]: I0227 00:10:26.918307 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-52xgq" podStartSLOduration=2.61737827 podStartE2EDuration="1m14.918301136s" podCreationTimestamp="2026-02-27 00:09:12 +0000 UTC" firstStartedPulling="2026-02-27 00:09:14.01346862 +0000 UTC m=+223.271008174" lastFinishedPulling="2026-02-27 00:10:26.314391486 +0000 UTC m=+295.571931040" observedRunningTime="2026-02-27 00:10:26.916513651 +0000 UTC m=+296.174053215" watchObservedRunningTime="2026-02-27 00:10:26.918301136 +0000 UTC m=+296.175840690"
Feb 27 00:10:26 crc kubenswrapper[4781]: I0227 00:10:26.937751 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-42hbx" podStartSLOduration=2.604352457 podStartE2EDuration="1m15.937737439s" podCreationTimestamp="2026-02-27 00:09:11 +0000 UTC" firstStartedPulling="2026-02-27 00:09:12.937484181 +0000 UTC m=+222.195023735" lastFinishedPulling="2026-02-27 00:10:26.270869153 +0000 UTC m=+295.528408717" observedRunningTime="2026-02-27 00:10:26.934987611 +0000 UTC m=+296.192527165" watchObservedRunningTime="2026-02-27 00:10:26.937737439 +0000 UTC m=+296.195276993"
Feb 27 00:10:29 crc kubenswrapper[4781]: I0227 00:10:29.691726 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6dccb78d65-ddtvh"]
Feb 27 00:10:29 crc kubenswrapper[4781]: I0227 00:10:29.692394 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" podUID="62880941-3b5d-4517-b0df-8c5548f8298d" containerName="controller-manager" containerID="cri-o://b8e38f1e216924c6d8d1fbab43dc9e21f5123054b598be25b535e4702b38b353" gracePeriod=30
Feb 27 00:10:29 crc kubenswrapper[4781]: I0227 00:10:29.780386 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl"]
Feb 27 00:10:29 crc kubenswrapper[4781]: I0227 00:10:29.780598 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" podUID="b04ba107-8dd1-4d8d-88d3-5a762f6c60f1" containerName="route-controller-manager" containerID="cri-o://97648a6a7e717d8f52757923ad7a40f42501a7856e07525cea9b570b3b64fec5" gracePeriod=30
Feb 27 00:10:29 crc kubenswrapper[4781]: I0227 00:10:29.900787 4781 generic.go:334] "Generic (PLEG): container finished" podID="62880941-3b5d-4517-b0df-8c5548f8298d" containerID="b8e38f1e216924c6d8d1fbab43dc9e21f5123054b598be25b535e4702b38b353" exitCode=0
Feb 27 00:10:29 crc kubenswrapper[4781]: I0227 00:10:29.901082 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" event={"ID":"62880941-3b5d-4517-b0df-8c5548f8298d","Type":"ContainerDied","Data":"b8e38f1e216924c6d8d1fbab43dc9e21f5123054b598be25b535e4702b38b353"}
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.263736 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh"
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.271038 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl"
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.317327 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txcb7\" (UniqueName: \"kubernetes.io/projected/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-kube-api-access-txcb7\") pod \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\" (UID: \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\") "
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.317384 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-serving-cert\") pod \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\" (UID: \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\") "
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.317411 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-client-ca\") pod \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\" (UID: \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\") "
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.317438 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62880941-3b5d-4517-b0df-8c5548f8298d-serving-cert\") pod \"62880941-3b5d-4517-b0df-8c5548f8298d\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") "
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.317507 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62880941-3b5d-4517-b0df-8c5548f8298d-config\") pod \"62880941-3b5d-4517-b0df-8c5548f8298d\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") "
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.317524 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62880941-3b5d-4517-b0df-8c5548f8298d-proxy-ca-bundles\") pod \"62880941-3b5d-4517-b0df-8c5548f8298d\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") "
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.317545 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-config\") pod \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\" (UID: \"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1\") "
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.317569 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/62880941-3b5d-4517-b0df-8c5548f8298d-client-ca\") pod \"62880941-3b5d-4517-b0df-8c5548f8298d\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") "
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.317603 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6rqm\" (UniqueName: \"kubernetes.io/projected/62880941-3b5d-4517-b0df-8c5548f8298d-kube-api-access-h6rqm\") pod \"62880941-3b5d-4517-b0df-8c5548f8298d\" (UID: \"62880941-3b5d-4517-b0df-8c5548f8298d\") "
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.318769 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62880941-3b5d-4517-b0df-8c5548f8298d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "62880941-3b5d-4517-b0df-8c5548f8298d" (UID: "62880941-3b5d-4517-b0df-8c5548f8298d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.318784 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-config" (OuterVolumeSpecName: "config") pod "b04ba107-8dd1-4d8d-88d3-5a762f6c60f1" (UID: "b04ba107-8dd1-4d8d-88d3-5a762f6c60f1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.318775 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62880941-3b5d-4517-b0df-8c5548f8298d-client-ca" (OuterVolumeSpecName: "client-ca") pod "62880941-3b5d-4517-b0df-8c5548f8298d" (UID: "62880941-3b5d-4517-b0df-8c5548f8298d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.318836 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62880941-3b5d-4517-b0df-8c5548f8298d-config" (OuterVolumeSpecName: "config") pod "62880941-3b5d-4517-b0df-8c5548f8298d" (UID: "62880941-3b5d-4517-b0df-8c5548f8298d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.319238 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-client-ca" (OuterVolumeSpecName: "client-ca") pod "b04ba107-8dd1-4d8d-88d3-5a762f6c60f1" (UID: "b04ba107-8dd1-4d8d-88d3-5a762f6c60f1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.323360 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b04ba107-8dd1-4d8d-88d3-5a762f6c60f1" (UID: "b04ba107-8dd1-4d8d-88d3-5a762f6c60f1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.323391 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62880941-3b5d-4517-b0df-8c5548f8298d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "62880941-3b5d-4517-b0df-8c5548f8298d" (UID: "62880941-3b5d-4517-b0df-8c5548f8298d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.323442 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-kube-api-access-txcb7" (OuterVolumeSpecName: "kube-api-access-txcb7") pod "b04ba107-8dd1-4d8d-88d3-5a762f6c60f1" (UID: "b04ba107-8dd1-4d8d-88d3-5a762f6c60f1"). InnerVolumeSpecName "kube-api-access-txcb7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.329902 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62880941-3b5d-4517-b0df-8c5548f8298d-kube-api-access-h6rqm" (OuterVolumeSpecName: "kube-api-access-h6rqm") pod "62880941-3b5d-4517-b0df-8c5548f8298d" (UID: "62880941-3b5d-4517-b0df-8c5548f8298d"). InnerVolumeSpecName "kube-api-access-h6rqm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.418676 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62880941-3b5d-4517-b0df-8c5548f8298d-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.418720 4781 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62880941-3b5d-4517-b0df-8c5548f8298d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.418734 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.418749 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/62880941-3b5d-4517-b0df-8c5548f8298d-client-ca\") on node \"crc\" DevicePath \"\""
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.418762 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6rqm\" (UniqueName: \"kubernetes.io/projected/62880941-3b5d-4517-b0df-8c5548f8298d-kube-api-access-h6rqm\") on node \"crc\" DevicePath \"\""
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.418775 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txcb7\" (UniqueName: \"kubernetes.io/projected/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-kube-api-access-txcb7\") on node \"crc\" DevicePath \"\""
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.418787 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.418799 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1-client-ca\") on node \"crc\" DevicePath \"\""
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.418810 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62880941-3b5d-4517-b0df-8c5548f8298d-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.912154 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh"
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.912160 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dccb78d65-ddtvh" event={"ID":"62880941-3b5d-4517-b0df-8c5548f8298d","Type":"ContainerDied","Data":"b10681d24475519937e6ebec881f7fb610590065caed7686034ef5962eec396d"}
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.912253 4781 scope.go:117] "RemoveContainer" containerID="b8e38f1e216924c6d8d1fbab43dc9e21f5123054b598be25b535e4702b38b353"
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.914674 4781 generic.go:334] "Generic (PLEG): container finished" podID="b04ba107-8dd1-4d8d-88d3-5a762f6c60f1" containerID="97648a6a7e717d8f52757923ad7a40f42501a7856e07525cea9b570b3b64fec5" exitCode=0
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.914748 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl"
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.914757 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" event={"ID":"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1","Type":"ContainerDied","Data":"97648a6a7e717d8f52757923ad7a40f42501a7856e07525cea9b570b3b64fec5"}
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.914801 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl" event={"ID":"b04ba107-8dd1-4d8d-88d3-5a762f6c60f1","Type":"ContainerDied","Data":"f45bacdd853d75d92adb8b303b061914233c1920428a49de04d9a44ec247f82d"}
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.958503 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6dccb78d65-ddtvh"]
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.958801 4781 scope.go:117] "RemoveContainer" containerID="97648a6a7e717d8f52757923ad7a40f42501a7856e07525cea9b570b3b64fec5"
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.961467 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6dccb78d65-ddtvh"]
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.976840 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl"]
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.983071 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-586cccdbf9-r7jxl"]
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.989134 4781 scope.go:117] "RemoveContainer" containerID="97648a6a7e717d8f52757923ad7a40f42501a7856e07525cea9b570b3b64fec5"
Feb 27 00:10:30 crc kubenswrapper[4781]: E0227 00:10:30.990135 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97648a6a7e717d8f52757923ad7a40f42501a7856e07525cea9b570b3b64fec5\": container with ID starting with 97648a6a7e717d8f52757923ad7a40f42501a7856e07525cea9b570b3b64fec5 not found: ID does not exist" containerID="97648a6a7e717d8f52757923ad7a40f42501a7856e07525cea9b570b3b64fec5"
Feb 27 00:10:30 crc kubenswrapper[4781]: I0227 00:10:30.990174 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97648a6a7e717d8f52757923ad7a40f42501a7856e07525cea9b570b3b64fec5"} err="failed to get container status \"97648a6a7e717d8f52757923ad7a40f42501a7856e07525cea9b570b3b64fec5\": rpc error: code = NotFound desc = could not find container \"97648a6a7e717d8f52757923ad7a40f42501a7856e07525cea9b570b3b64fec5\": container with ID starting with 97648a6a7e717d8f52757923ad7a40f42501a7856e07525cea9b570b3b64fec5 not found: ID does not exist"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.062758 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.150675 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5b77c846cc-7b4k9"]
Feb 27 00:10:31 crc kubenswrapper[4781]: E0227 00:10:31.151659 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="514049ae-2568-416f-9705-524c2bf74cbd" containerName="registry-server"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.151702 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="514049ae-2568-416f-9705-524c2bf74cbd" containerName="registry-server"
Feb 27 00:10:31 crc kubenswrapper[4781]: E0227 00:10:31.151729 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b04ba107-8dd1-4d8d-88d3-5a762f6c60f1" containerName="route-controller-manager"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.151749 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b04ba107-8dd1-4d8d-88d3-5a762f6c60f1" containerName="route-controller-manager"
Feb 27 00:10:31 crc kubenswrapper[4781]: E0227 00:10:31.151790 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97e44b43-3c8e-4065-a51b-aa3f27c36712" containerName="extract-utilities"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.151810 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="97e44b43-3c8e-4065-a51b-aa3f27c36712" containerName="extract-utilities"
Feb 27 00:10:31 crc kubenswrapper[4781]: E0227 00:10:31.151838 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97e44b43-3c8e-4065-a51b-aa3f27c36712" containerName="extract-content"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.151856 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="97e44b43-3c8e-4065-a51b-aa3f27c36712" containerName="extract-content"
Feb 27 00:10:31 crc kubenswrapper[4781]: E0227 00:10:31.151884 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97e44b43-3c8e-4065-a51b-aa3f27c36712" containerName="registry-server"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.151903 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="97e44b43-3c8e-4065-a51b-aa3f27c36712" containerName="registry-server"
Feb 27 00:10:31 crc kubenswrapper[4781]: E0227 00:10:31.151933 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="514049ae-2568-416f-9705-524c2bf74cbd" containerName="extract-content"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.151953 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="514049ae-2568-416f-9705-524c2bf74cbd" containerName="extract-content"
Feb 27 00:10:31 crc kubenswrapper[4781]: E0227 00:10:31.151979 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="514049ae-2568-416f-9705-524c2bf74cbd" containerName="extract-utilities"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.151998 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="514049ae-2568-416f-9705-524c2bf74cbd" containerName="extract-utilities"
Feb 27 00:10:31 crc kubenswrapper[4781]: E0227 00:10:31.152023 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62880941-3b5d-4517-b0df-8c5548f8298d" containerName="controller-manager"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.152043 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="62880941-3b5d-4517-b0df-8c5548f8298d" containerName="controller-manager"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.152302 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="514049ae-2568-416f-9705-524c2bf74cbd" containerName="registry-server"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.152348 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="b04ba107-8dd1-4d8d-88d3-5a762f6c60f1" containerName="route-controller-manager"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.152373 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="97e44b43-3c8e-4065-a51b-aa3f27c36712" containerName="registry-server"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.152405 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="62880941-3b5d-4517-b0df-8c5548f8298d" containerName="controller-manager"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.153386 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.154512 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c"]
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.155320 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.160134 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.160524 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.163621 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.164183 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.164439 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.164705 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.164988 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.166297 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.166439 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.166569 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.169392 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.169611 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.182210 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.197750 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b77c846cc-7b4k9"]
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.201429 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c"]
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.319283 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62880941-3b5d-4517-b0df-8c5548f8298d" path="/var/lib/kubelet/pods/62880941-3b5d-4517-b0df-8c5548f8298d/volumes"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.319851 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b04ba107-8dd1-4d8d-88d3-5a762f6c60f1" path="/var/lib/kubelet/pods/b04ba107-8dd1-4d8d-88d3-5a762f6c60f1/volumes"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.344475 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5b72f5f5-6a79-467f-b65c-4079430ea22c-client-ca\") pod \"route-controller-manager-568796b7d7-sxr4c\" (UID: \"5b72f5f5-6a79-467f-b65c-4079430ea22c\") " pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.344742 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b72f5f5-6a79-467f-b65c-4079430ea22c-config\") pod \"route-controller-manager-568796b7d7-sxr4c\" (UID: \"5b72f5f5-6a79-467f-b65c-4079430ea22c\") " pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.344828 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b72f5f5-6a79-467f-b65c-4079430ea22c-serving-cert\") pod \"route-controller-manager-568796b7d7-sxr4c\" (UID: \"5b72f5f5-6a79-467f-b65c-4079430ea22c\") " pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.344903 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3849839b-223f-4c16-8aca-0f7b82e30586-client-ca\") pod \"controller-manager-5b77c846cc-7b4k9\" (UID: \"3849839b-223f-4c16-8aca-0f7b82e30586\") " pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.344991 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3849839b-223f-4c16-8aca-0f7b82e30586-config\") pod \"controller-manager-5b77c846cc-7b4k9\" (UID: \"3849839b-223f-4c16-8aca-0f7b82e30586\") " pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.345096 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4stn\" (UniqueName: \"kubernetes.io/projected/5b72f5f5-6a79-467f-b65c-4079430ea22c-kube-api-access-n4stn\") pod \"route-controller-manager-568796b7d7-sxr4c\" (UID: \"5b72f5f5-6a79-467f-b65c-4079430ea22c\") " pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.345175 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h7w6\" (UniqueName: \"kubernetes.io/projected/3849839b-223f-4c16-8aca-0f7b82e30586-kube-api-access-9h7w6\") pod \"controller-manager-5b77c846cc-7b4k9\" (UID: \"3849839b-223f-4c16-8aca-0f7b82e30586\") " pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.345241 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3849839b-223f-4c16-8aca-0f7b82e30586-proxy-ca-bundles\") pod \"controller-manager-5b77c846cc-7b4k9\" (UID: \"3849839b-223f-4c16-8aca-0f7b82e30586\") " pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.345320 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3849839b-223f-4c16-8aca-0f7b82e30586-serving-cert\") pod \"controller-manager-5b77c846cc-7b4k9\" (UID: \"3849839b-223f-4c16-8aca-0f7b82e30586\") " pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9"
Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.446701 4781 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"kube-api-access-9h7w6\" (UniqueName: \"kubernetes.io/projected/3849839b-223f-4c16-8aca-0f7b82e30586-kube-api-access-9h7w6\") pod \"controller-manager-5b77c846cc-7b4k9\" (UID: \"3849839b-223f-4c16-8aca-0f7b82e30586\") " pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.446744 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3849839b-223f-4c16-8aca-0f7b82e30586-proxy-ca-bundles\") pod \"controller-manager-5b77c846cc-7b4k9\" (UID: \"3849839b-223f-4c16-8aca-0f7b82e30586\") " pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.446774 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3849839b-223f-4c16-8aca-0f7b82e30586-serving-cert\") pod \"controller-manager-5b77c846cc-7b4k9\" (UID: \"3849839b-223f-4c16-8aca-0f7b82e30586\") " pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.446836 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b72f5f5-6a79-467f-b65c-4079430ea22c-config\") pod \"route-controller-manager-568796b7d7-sxr4c\" (UID: \"5b72f5f5-6a79-467f-b65c-4079430ea22c\") " pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.446858 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5b72f5f5-6a79-467f-b65c-4079430ea22c-client-ca\") pod \"route-controller-manager-568796b7d7-sxr4c\" (UID: \"5b72f5f5-6a79-467f-b65c-4079430ea22c\") " 
pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.446885 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b72f5f5-6a79-467f-b65c-4079430ea22c-serving-cert\") pod \"route-controller-manager-568796b7d7-sxr4c\" (UID: \"5b72f5f5-6a79-467f-b65c-4079430ea22c\") " pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.446913 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3849839b-223f-4c16-8aca-0f7b82e30586-client-ca\") pod \"controller-manager-5b77c846cc-7b4k9\" (UID: \"3849839b-223f-4c16-8aca-0f7b82e30586\") " pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.446947 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3849839b-223f-4c16-8aca-0f7b82e30586-config\") pod \"controller-manager-5b77c846cc-7b4k9\" (UID: \"3849839b-223f-4c16-8aca-0f7b82e30586\") " pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.446993 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4stn\" (UniqueName: \"kubernetes.io/projected/5b72f5f5-6a79-467f-b65c-4079430ea22c-kube-api-access-n4stn\") pod \"route-controller-manager-568796b7d7-sxr4c\" (UID: \"5b72f5f5-6a79-467f-b65c-4079430ea22c\") " pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.455687 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 27 00:10:31 crc 
kubenswrapper[4781]: I0227 00:10:31.455983 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.456102 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.457080 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.458355 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b72f5f5-6a79-467f-b65c-4079430ea22c-config\") pod \"route-controller-manager-568796b7d7-sxr4c\" (UID: \"5b72f5f5-6a79-467f-b65c-4079430ea22c\") " pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.459142 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.459168 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3849839b-223f-4c16-8aca-0f7b82e30586-config\") pod \"controller-manager-5b77c846cc-7b4k9\" (UID: \"3849839b-223f-4c16-8aca-0f7b82e30586\") " pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.459243 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.459374 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.459983 4781 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3849839b-223f-4c16-8aca-0f7b82e30586-client-ca\") pod \"controller-manager-5b77c846cc-7b4k9\" (UID: \"3849839b-223f-4c16-8aca-0f7b82e30586\") " pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.460115 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5b72f5f5-6a79-467f-b65c-4079430ea22c-client-ca\") pod \"route-controller-manager-568796b7d7-sxr4c\" (UID: \"5b72f5f5-6a79-467f-b65c-4079430ea22c\") " pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.464495 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b72f5f5-6a79-467f-b65c-4079430ea22c-serving-cert\") pod \"route-controller-manager-568796b7d7-sxr4c\" (UID: \"5b72f5f5-6a79-467f-b65c-4079430ea22c\") " pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.469368 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3849839b-223f-4c16-8aca-0f7b82e30586-proxy-ca-bundles\") pod \"controller-manager-5b77c846cc-7b4k9\" (UID: \"3849839b-223f-4c16-8aca-0f7b82e30586\") " pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.470376 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.471931 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.474923 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3849839b-223f-4c16-8aca-0f7b82e30586-serving-cert\") pod \"controller-manager-5b77c846cc-7b4k9\" (UID: \"3849839b-223f-4c16-8aca-0f7b82e30586\") " pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.481154 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.483142 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.492513 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4stn\" (UniqueName: \"kubernetes.io/projected/5b72f5f5-6a79-467f-b65c-4079430ea22c-kube-api-access-n4stn\") pod \"route-controller-manager-568796b7d7-sxr4c\" (UID: \"5b72f5f5-6a79-467f-b65c-4079430ea22c\") " pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.496545 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h7w6\" (UniqueName: \"kubernetes.io/projected/3849839b-223f-4c16-8aca-0f7b82e30586-kube-api-access-9h7w6\") pod \"controller-manager-5b77c846cc-7b4k9\" (UID: \"3849839b-223f-4c16-8aca-0f7b82e30586\") " pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.508476 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.515331 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.733824 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c"] Feb 27 00:10:31 crc kubenswrapper[4781]: W0227 00:10:31.749960 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b72f5f5_6a79_467f_b65c_4079430ea22c.slice/crio-3505b3780dca1dddfaee8c72b7f6f032411eb02bef4488e644d77b4d57ca8cb1 WatchSource:0}: Error finding container 3505b3780dca1dddfaee8c72b7f6f032411eb02bef4488e644d77b4d57ca8cb1: Status 404 returned error can't find the container with id 3505b3780dca1dddfaee8c72b7f6f032411eb02bef4488e644d77b4d57ca8cb1 Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.791542 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.800841 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.846423 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-42hbx" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.847052 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-42hbx" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.909992 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-42hbx" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.923075 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c" event={"ID":"5b72f5f5-6a79-467f-b65c-4079430ea22c","Type":"ContainerStarted","Data":"3505b3780dca1dddfaee8c72b7f6f032411eb02bef4488e644d77b4d57ca8cb1"} Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.972582 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-42hbx" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.992075 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kztqg" Feb 27 00:10:31 crc kubenswrapper[4781]: I0227 00:10:31.992150 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kztqg" Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.041693 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kztqg" Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.192896 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kqrgb" Feb 27 00:10:32 crc 
kubenswrapper[4781]: I0227 00:10:32.192944 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kqrgb" Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.238415 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kqrgb" Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.294455 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b77c846cc-7b4k9"] Feb 27 00:10:32 crc kubenswrapper[4781]: W0227 00:10:32.295246 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3849839b_223f_4c16_8aca_0f7b82e30586.slice/crio-26f5df4976ebf4d2761f889a8f03fbf5e4ed1b2fd4e3be10a9c60287deb69254 WatchSource:0}: Error finding container 26f5df4976ebf4d2761f889a8f03fbf5e4ed1b2fd4e3be10a9c60287deb69254: Status 404 returned error can't find the container with id 26f5df4976ebf4d2761f889a8f03fbf5e4ed1b2fd4e3be10a9c60287deb69254 Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.441664 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-52xgq" Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.441746 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-52xgq" Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.484128 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-52xgq" Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.927793 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" event={"ID":"3849839b-223f-4c16-8aca-0f7b82e30586","Type":"ContainerStarted","Data":"46f45b8f3a4098b7387e66d8e638b26e0546877ca1f6e13ddb0da4ec6df1e284"} 
Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.928127 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" event={"ID":"3849839b-223f-4c16-8aca-0f7b82e30586","Type":"ContainerStarted","Data":"26f5df4976ebf4d2761f889a8f03fbf5e4ed1b2fd4e3be10a9c60287deb69254"} Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.928806 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.931999 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c" event={"ID":"5b72f5f5-6a79-467f-b65c-4079430ea22c","Type":"ContainerStarted","Data":"8a239fbb3c7f0ce079f37f476f358d6bc0c3a1ec38e0b9805a63bb642f15f6cf"} Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.932042 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c" Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.933895 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.939055 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c" Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.967830 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5b77c846cc-7b4k9" podStartSLOduration=3.967809545 podStartE2EDuration="3.967809545s" podCreationTimestamp="2026-02-27 00:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-27 00:10:32.949539061 +0000 UTC m=+302.207078615" watchObservedRunningTime="2026-02-27 00:10:32.967809545 +0000 UTC m=+302.225349099" Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.976153 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kqrgb" Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.988344 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-568796b7d7-sxr4c" podStartSLOduration=3.988325816 podStartE2EDuration="3.988325816s" podCreationTimestamp="2026-02-27 00:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:10:32.983188658 +0000 UTC m=+302.240728222" watchObservedRunningTime="2026-02-27 00:10:32.988325816 +0000 UTC m=+302.245865380" Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.989043 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kztqg" Feb 27 00:10:32 crc kubenswrapper[4781]: I0227 00:10:32.994189 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-52xgq" Feb 27 00:10:33 crc kubenswrapper[4781]: I0227 00:10:33.596563 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2zhrk"] Feb 27 00:10:34 crc kubenswrapper[4781]: I0227 00:10:34.740782 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-52xgq"] Feb 27 00:10:34 crc kubenswrapper[4781]: I0227 00:10:34.941823 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-52xgq" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" containerName="registry-server" 
containerID="cri-o://257d15c87d7e86d0b22fe731221ea29f1baa1f76ffdd99d32b45f52129583bd8" gracePeriod=2 Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.004218 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hcdz5" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.004288 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hcdz5" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.065420 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hcdz5" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.466119 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-52xgq" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.605555 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8558\" (UniqueName: \"kubernetes.io/projected/0f286d62-2145-4bbb-91eb-28ffda9b2494-kube-api-access-f8558\") pod \"0f286d62-2145-4bbb-91eb-28ffda9b2494\" (UID: \"0f286d62-2145-4bbb-91eb-28ffda9b2494\") " Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.605717 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f286d62-2145-4bbb-91eb-28ffda9b2494-catalog-content\") pod \"0f286d62-2145-4bbb-91eb-28ffda9b2494\" (UID: \"0f286d62-2145-4bbb-91eb-28ffda9b2494\") " Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.605813 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f286d62-2145-4bbb-91eb-28ffda9b2494-utilities\") pod \"0f286d62-2145-4bbb-91eb-28ffda9b2494\" (UID: \"0f286d62-2145-4bbb-91eb-28ffda9b2494\") " Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.606668 4781 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f286d62-2145-4bbb-91eb-28ffda9b2494-utilities" (OuterVolumeSpecName: "utilities") pod "0f286d62-2145-4bbb-91eb-28ffda9b2494" (UID: "0f286d62-2145-4bbb-91eb-28ffda9b2494"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.616606 4781 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 27 00:10:35 crc kubenswrapper[4781]: E0227 00:10:35.617044 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" containerName="registry-server" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.617057 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" containerName="registry-server" Feb 27 00:10:35 crc kubenswrapper[4781]: E0227 00:10:35.617071 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" containerName="extract-content" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.617077 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" containerName="extract-content" Feb 27 00:10:35 crc kubenswrapper[4781]: E0227 00:10:35.617088 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" containerName="extract-utilities" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.617094 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" containerName="extract-utilities" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.617197 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" containerName="registry-server" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 
00:10:35.617543 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.618525 4781 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.618847 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c" gracePeriod=15 Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.618871 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d" gracePeriod=15 Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.618887 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226" gracePeriod=15 Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.618916 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b" gracePeriod=15 Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.618984 4781 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5" gracePeriod=15 Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.624764 4781 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 27 00:10:35 crc kubenswrapper[4781]: E0227 00:10:35.625164 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.625189 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 27 00:10:35 crc kubenswrapper[4781]: E0227 00:10:35.625210 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.625225 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 00:10:35 crc kubenswrapper[4781]: E0227 00:10:35.625243 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.625256 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 27 00:10:35 crc kubenswrapper[4781]: E0227 00:10:35.625285 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.625302 4781 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 00:10:35 crc kubenswrapper[4781]: E0227 00:10:35.625322 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.625334 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 27 00:10:35 crc kubenswrapper[4781]: E0227 00:10:35.625355 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.625368 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 27 00:10:35 crc kubenswrapper[4781]: E0227 00:10:35.625388 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.625400 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 00:10:35 crc kubenswrapper[4781]: E0227 00:10:35.625418 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.625431 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 00:10:35 crc kubenswrapper[4781]: E0227 00:10:35.625448 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.625461 
4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 27 00:10:35 crc kubenswrapper[4781]: E0227 00:10:35.625481 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.625494 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.625728 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.625750 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.625772 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.625791 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.625811 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.625829 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.625850 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.625882 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.626262 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.627751 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f286d62-2145-4bbb-91eb-28ffda9b2494-kube-api-access-f8558" (OuterVolumeSpecName: "kube-api-access-f8558") pod "0f286d62-2145-4bbb-91eb-28ffda9b2494" (UID: "0f286d62-2145-4bbb-91eb-28ffda9b2494"). InnerVolumeSpecName "kube-api-access-f8558". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.673177 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f286d62-2145-4bbb-91eb-28ffda9b2494-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f286d62-2145-4bbb-91eb-28ffda9b2494" (UID: "0f286d62-2145-4bbb-91eb-28ffda9b2494"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.707887 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.707931 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.707955 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.707973 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.707992 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.708052 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.708085 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.708107 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.708164 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f286d62-2145-4bbb-91eb-28ffda9b2494-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.708175 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8558\" (UniqueName: \"kubernetes.io/projected/0f286d62-2145-4bbb-91eb-28ffda9b2494-kube-api-access-f8558\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.708184 4781 reconciler_common.go:293] "Volume detached for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f286d62-2145-4bbb-91eb-28ffda9b2494-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.809244 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.809295 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.809320 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.809338 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.809352 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.809370 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.809381 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.809425 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.809392 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.809453 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 
00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.809486 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.809468 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.809453 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.809525 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.809545 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.809590 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.950786 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.952914 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.953876 4781 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226" exitCode=0 Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.953914 4781 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b" exitCode=0 Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.953932 4781 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d" exitCode=0 Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.953948 4781 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5" exitCode=2 Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.954019 4781 scope.go:117] "RemoveContainer" containerID="556772fc7ba30a8b95568a66f16337bf404227a709c709d343918d3bd6e17ba5" Feb 27 
00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.956231 4781 generic.go:334] "Generic (PLEG): container finished" podID="7c8795e9-9244-4cc4-a297-3aec68bf3588" containerID="2c434d493ffe4d5672fb6269468215eb15ce1d96ef38aac19ec03432d5d7c9b5" exitCode=0 Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.956314 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"7c8795e9-9244-4cc4-a297-3aec68bf3588","Type":"ContainerDied","Data":"2c434d493ffe4d5672fb6269468215eb15ce1d96ef38aac19ec03432d5d7c9b5"} Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.957096 4781 status_manager.go:851] "Failed to get status for pod" podUID="7c8795e9-9244-4cc4-a297-3aec68bf3588" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.957733 4781 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.961522 4781 generic.go:334] "Generic (PLEG): container finished" podID="0f286d62-2145-4bbb-91eb-28ffda9b2494" containerID="257d15c87d7e86d0b22fe731221ea29f1baa1f76ffdd99d32b45f52129583bd8" exitCode=0 Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.961593 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-52xgq" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.961668 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52xgq" event={"ID":"0f286d62-2145-4bbb-91eb-28ffda9b2494","Type":"ContainerDied","Data":"257d15c87d7e86d0b22fe731221ea29f1baa1f76ffdd99d32b45f52129583bd8"} Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.961754 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52xgq" event={"ID":"0f286d62-2145-4bbb-91eb-28ffda9b2494","Type":"ContainerDied","Data":"dc9d59b8ab934cad32f1842b836646a3832e9408664f5c6c345f309f196516de"} Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.962318 4781 status_manager.go:851] "Failed to get status for pod" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" pod="openshift-marketplace/certified-operators-52xgq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-52xgq\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.962735 4781 status_manager.go:851] "Failed to get status for pod" podUID="7c8795e9-9244-4cc4-a297-3aec68bf3588" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.963266 4781 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.978544 4781 scope.go:117] "RemoveContainer" 
containerID="257d15c87d7e86d0b22fe731221ea29f1baa1f76ffdd99d32b45f52129583bd8" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.987531 4781 status_manager.go:851] "Failed to get status for pod" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" pod="openshift-marketplace/certified-operators-52xgq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-52xgq\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.989372 4781 status_manager.go:851] "Failed to get status for pod" podUID="7c8795e9-9244-4cc4-a297-3aec68bf3588" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:35 crc kubenswrapper[4781]: I0227 00:10:35.990121 4781 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:36 crc kubenswrapper[4781]: I0227 00:10:36.005054 4781 scope.go:117] "RemoveContainer" containerID="4a5aea3fd523014c3b1508c7df954cc0b54bc1a3d937ac4cfe23aff82780bb4f" Feb 27 00:10:36 crc kubenswrapper[4781]: I0227 00:10:36.016076 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hcdz5" Feb 27 00:10:36 crc kubenswrapper[4781]: I0227 00:10:36.016896 4781 status_manager.go:851] "Failed to get status for pod" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" pod="openshift-marketplace/certified-operators-52xgq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-52xgq\": dial tcp 38.102.83.89:6443: connect: connection 
refused" Feb 27 00:10:36 crc kubenswrapper[4781]: I0227 00:10:36.017249 4781 status_manager.go:851] "Failed to get status for pod" podUID="7c8795e9-9244-4cc4-a297-3aec68bf3588" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:36 crc kubenswrapper[4781]: I0227 00:10:36.017747 4781 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:36 crc kubenswrapper[4781]: I0227 00:10:36.018366 4781 status_manager.go:851] "Failed to get status for pod" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" pod="openshift-marketplace/redhat-operators-hcdz5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hcdz5\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:36 crc kubenswrapper[4781]: I0227 00:10:36.032442 4781 scope.go:117] "RemoveContainer" containerID="0aa76e86a3a5fbb4cc2b305465d6d4d5e1add388f317a872da345b8ab05c1fce" Feb 27 00:10:36 crc kubenswrapper[4781]: I0227 00:10:36.050980 4781 scope.go:117] "RemoveContainer" containerID="257d15c87d7e86d0b22fe731221ea29f1baa1f76ffdd99d32b45f52129583bd8" Feb 27 00:10:36 crc kubenswrapper[4781]: E0227 00:10:36.051536 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"257d15c87d7e86d0b22fe731221ea29f1baa1f76ffdd99d32b45f52129583bd8\": container with ID starting with 257d15c87d7e86d0b22fe731221ea29f1baa1f76ffdd99d32b45f52129583bd8 not found: ID does not exist" containerID="257d15c87d7e86d0b22fe731221ea29f1baa1f76ffdd99d32b45f52129583bd8" Feb 27 
00:10:36 crc kubenswrapper[4781]: I0227 00:10:36.051582 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"257d15c87d7e86d0b22fe731221ea29f1baa1f76ffdd99d32b45f52129583bd8"} err="failed to get container status \"257d15c87d7e86d0b22fe731221ea29f1baa1f76ffdd99d32b45f52129583bd8\": rpc error: code = NotFound desc = could not find container \"257d15c87d7e86d0b22fe731221ea29f1baa1f76ffdd99d32b45f52129583bd8\": container with ID starting with 257d15c87d7e86d0b22fe731221ea29f1baa1f76ffdd99d32b45f52129583bd8 not found: ID does not exist" Feb 27 00:10:36 crc kubenswrapper[4781]: I0227 00:10:36.051617 4781 scope.go:117] "RemoveContainer" containerID="4a5aea3fd523014c3b1508c7df954cc0b54bc1a3d937ac4cfe23aff82780bb4f" Feb 27 00:10:36 crc kubenswrapper[4781]: E0227 00:10:36.052220 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a5aea3fd523014c3b1508c7df954cc0b54bc1a3d937ac4cfe23aff82780bb4f\": container with ID starting with 4a5aea3fd523014c3b1508c7df954cc0b54bc1a3d937ac4cfe23aff82780bb4f not found: ID does not exist" containerID="4a5aea3fd523014c3b1508c7df954cc0b54bc1a3d937ac4cfe23aff82780bb4f" Feb 27 00:10:36 crc kubenswrapper[4781]: I0227 00:10:36.052267 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a5aea3fd523014c3b1508c7df954cc0b54bc1a3d937ac4cfe23aff82780bb4f"} err="failed to get container status \"4a5aea3fd523014c3b1508c7df954cc0b54bc1a3d937ac4cfe23aff82780bb4f\": rpc error: code = NotFound desc = could not find container \"4a5aea3fd523014c3b1508c7df954cc0b54bc1a3d937ac4cfe23aff82780bb4f\": container with ID starting with 4a5aea3fd523014c3b1508c7df954cc0b54bc1a3d937ac4cfe23aff82780bb4f not found: ID does not exist" Feb 27 00:10:36 crc kubenswrapper[4781]: I0227 00:10:36.052297 4781 scope.go:117] "RemoveContainer" 
containerID="0aa76e86a3a5fbb4cc2b305465d6d4d5e1add388f317a872da345b8ab05c1fce" Feb 27 00:10:36 crc kubenswrapper[4781]: E0227 00:10:36.052588 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0aa76e86a3a5fbb4cc2b305465d6d4d5e1add388f317a872da345b8ab05c1fce\": container with ID starting with 0aa76e86a3a5fbb4cc2b305465d6d4d5e1add388f317a872da345b8ab05c1fce not found: ID does not exist" containerID="0aa76e86a3a5fbb4cc2b305465d6d4d5e1add388f317a872da345b8ab05c1fce" Feb 27 00:10:36 crc kubenswrapper[4781]: I0227 00:10:36.052654 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aa76e86a3a5fbb4cc2b305465d6d4d5e1add388f317a872da345b8ab05c1fce"} err="failed to get container status \"0aa76e86a3a5fbb4cc2b305465d6d4d5e1add388f317a872da345b8ab05c1fce\": rpc error: code = NotFound desc = could not find container \"0aa76e86a3a5fbb4cc2b305465d6d4d5e1add388f317a872da345b8ab05c1fce\": container with ID starting with 0aa76e86a3a5fbb4cc2b305465d6d4d5e1add388f317a872da345b8ab05c1fce not found: ID does not exist" Feb 27 00:10:36 crc kubenswrapper[4781]: I0227 00:10:36.975093 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 27 00:10:37 crc kubenswrapper[4781]: E0227 00:10:37.336088 4781 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.89:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" volumeName="registry-storage" Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.407812 4781 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.409830 4781 status_manager.go:851] "Failed to get status for pod" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" pod="openshift-marketplace/certified-operators-52xgq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-52xgq\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.410533 4781 status_manager.go:851] "Failed to get status for pod" podUID="7c8795e9-9244-4cc4-a297-3aec68bf3588" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.411014 4781 status_manager.go:851] "Failed to get status for pod" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" pod="openshift-marketplace/redhat-operators-hcdz5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hcdz5\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.440123 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7c8795e9-9244-4cc4-a297-3aec68bf3588-kubelet-dir\") pod \"7c8795e9-9244-4cc4-a297-3aec68bf3588\" (UID: \"7c8795e9-9244-4cc4-a297-3aec68bf3588\") " Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.440568 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7c8795e9-9244-4cc4-a297-3aec68bf3588-kube-api-access\") pod \"7c8795e9-9244-4cc4-a297-3aec68bf3588\" (UID: \"7c8795e9-9244-4cc4-a297-3aec68bf3588\") " Feb 27 00:10:37 
crc kubenswrapper[4781]: I0227 00:10:37.440877 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7c8795e9-9244-4cc4-a297-3aec68bf3588-var-lock\") pod \"7c8795e9-9244-4cc4-a297-3aec68bf3588\" (UID: \"7c8795e9-9244-4cc4-a297-3aec68bf3588\") " Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.440306 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7c8795e9-9244-4cc4-a297-3aec68bf3588-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7c8795e9-9244-4cc4-a297-3aec68bf3588" (UID: "7c8795e9-9244-4cc4-a297-3aec68bf3588"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.441591 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7c8795e9-9244-4cc4-a297-3aec68bf3588-var-lock" (OuterVolumeSpecName: "var-lock") pod "7c8795e9-9244-4cc4-a297-3aec68bf3588" (UID: "7c8795e9-9244-4cc4-a297-3aec68bf3588"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.446776 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c8795e9-9244-4cc4-a297-3aec68bf3588-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7c8795e9-9244-4cc4-a297-3aec68bf3588" (UID: "7c8795e9-9244-4cc4-a297-3aec68bf3588"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.542900 4781 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7c8795e9-9244-4cc4-a297-3aec68bf3588-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.542934 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7c8795e9-9244-4cc4-a297-3aec68bf3588-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.542945 4781 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7c8795e9-9244-4cc4-a297-3aec68bf3588-var-lock\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.985051 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"7c8795e9-9244-4cc4-a297-3aec68bf3588","Type":"ContainerDied","Data":"f39f3390eb5e42a403a575333d110cbe5ece9b7617819b5c4f74d934848ba9f2"} Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.985530 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f39f3390eb5e42a403a575333d110cbe5ece9b7617819b5c4f74d934848ba9f2" Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.985111 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.988367 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.989345 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.989489 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.989902 4781 status_manager.go:851] "Failed to get status for pod" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" pod="openshift-marketplace/redhat-operators-hcdz5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hcdz5\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.990346 4781 status_manager.go:851] "Failed to get status for pod" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" pod="openshift-marketplace/certified-operators-52xgq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-52xgq\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.990745 4781 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c" exitCode=0 Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.990736 4781 status_manager.go:851] "Failed to get status for pod" podUID="7c8795e9-9244-4cc4-a297-3aec68bf3588" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.990783 4781 scope.go:117] "RemoveContainer" containerID="b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226" Feb 27 00:10:37 crc kubenswrapper[4781]: I0227 00:10:37.991457 4781 
status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.006723 4781 status_manager.go:851] "Failed to get status for pod" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" pod="openshift-marketplace/certified-operators-52xgq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-52xgq\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.007251 4781 status_manager.go:851] "Failed to get status for pod" podUID="7c8795e9-9244-4cc4-a297-3aec68bf3588" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.007704 4781 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.008135 4781 status_manager.go:851] "Failed to get status for pod" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" pod="openshift-marketplace/redhat-operators-hcdz5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hcdz5\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.008581 4781 scope.go:117] "RemoveContainer" 
containerID="0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b" Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.022593 4781 scope.go:117] "RemoveContainer" containerID="543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d" Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.040767 4781 scope.go:117] "RemoveContainer" containerID="6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5" Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.046842 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.046901 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.046949 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.046961 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.047015 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.047114 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.047228 4781 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.047257 4781 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.047275 4781 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.060017 4781 scope.go:117] "RemoveContainer" containerID="4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c" Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.078538 4781 scope.go:117] "RemoveContainer" 
containerID="240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3" Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.095416 4781 scope.go:117] "RemoveContainer" containerID="b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226" Feb 27 00:10:38 crc kubenswrapper[4781]: E0227 00:10:38.095836 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\": container with ID starting with b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226 not found: ID does not exist" containerID="b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226" Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.095960 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226"} err="failed to get container status \"b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\": rpc error: code = NotFound desc = could not find container \"b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226\": container with ID starting with b2116f0be262ad21b83cc347485f612d323869f0e2c9315b5d1ad89215c1d226 not found: ID does not exist" Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.095990 4781 scope.go:117] "RemoveContainer" containerID="0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b" Feb 27 00:10:38 crc kubenswrapper[4781]: E0227 00:10:38.096195 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\": container with ID starting with 0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b not found: ID does not exist" containerID="0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b" Feb 27 00:10:38 crc 
kubenswrapper[4781]: I0227 00:10:38.096232 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b"} err="failed to get container status \"0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\": rpc error: code = NotFound desc = could not find container \"0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b\": container with ID starting with 0b95fca159c4aca1038e599603384b79501ced6328094ce10faa9547ac8c742b not found: ID does not exist" Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.096247 4781 scope.go:117] "RemoveContainer" containerID="543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d" Feb 27 00:10:38 crc kubenswrapper[4781]: E0227 00:10:38.096579 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\": container with ID starting with 543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d not found: ID does not exist" containerID="543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d" Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.096612 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d"} err="failed to get container status \"543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\": rpc error: code = NotFound desc = could not find container \"543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d\": container with ID starting with 543bbbc9a35b31119035772938a8db3fbf63aa29ff658d0e85f940c79954ea8d not found: ID does not exist" Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.096654 4781 scope.go:117] "RemoveContainer" containerID="6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5" Feb 27 
00:10:38 crc kubenswrapper[4781]: E0227 00:10:38.097075 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\": container with ID starting with 6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5 not found: ID does not exist" containerID="6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5" Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.097110 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5"} err="failed to get container status \"6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\": rpc error: code = NotFound desc = could not find container \"6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5\": container with ID starting with 6862d990889888d5ad60f5e5178ed5039796025fdc566076e8286a5031d635d5 not found: ID does not exist" Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.097130 4781 scope.go:117] "RemoveContainer" containerID="4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c" Feb 27 00:10:38 crc kubenswrapper[4781]: E0227 00:10:38.097476 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\": container with ID starting with 4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c not found: ID does not exist" containerID="4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c" Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.097499 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c"} err="failed to get container status 
\"4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\": rpc error: code = NotFound desc = could not find container \"4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c\": container with ID starting with 4be6f2d1a1982da39bd0787776b3993ac0754ed6f79f419f6fbce8018dedf84c not found: ID does not exist" Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.097516 4781 scope.go:117] "RemoveContainer" containerID="240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3" Feb 27 00:10:38 crc kubenswrapper[4781]: E0227 00:10:38.097794 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\": container with ID starting with 240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3 not found: ID does not exist" containerID="240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3" Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.097811 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3"} err="failed to get container status \"240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\": rpc error: code = NotFound desc = could not find container \"240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3\": container with ID starting with 240bae57340e87b5c1ecb63f06eab827300f74b7337667824b46cd57cb844ce3 not found: ID does not exist" Feb 27 00:10:38 crc kubenswrapper[4781]: I0227 00:10:38.996777 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:10:39 crc kubenswrapper[4781]: I0227 00:10:39.014015 4781 status_manager.go:851] "Failed to get status for pod" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" pod="openshift-marketplace/redhat-operators-hcdz5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hcdz5\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:39 crc kubenswrapper[4781]: I0227 00:10:39.014335 4781 status_manager.go:851] "Failed to get status for pod" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" pod="openshift-marketplace/certified-operators-52xgq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-52xgq\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:39 crc kubenswrapper[4781]: I0227 00:10:39.014761 4781 status_manager.go:851] "Failed to get status for pod" podUID="7c8795e9-9244-4cc4-a297-3aec68bf3588" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:39 crc kubenswrapper[4781]: I0227 00:10:39.015081 4781 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:39 crc kubenswrapper[4781]: I0227 00:10:39.315947 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 27 00:10:40 crc kubenswrapper[4781]: E0227 00:10:40.656574 4781 kubelet.go:1929] "Failed creating a mirror pod for" 
err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.89:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 00:10:40 crc kubenswrapper[4781]: I0227 00:10:40.657376 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 00:10:40 crc kubenswrapper[4781]: W0227 00:10:40.697563 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-2ce9262d5fb63261e35b2cd7782ae4d70d1a8c54505505b786cc12676dea8783 WatchSource:0}: Error finding container 2ce9262d5fb63261e35b2cd7782ae4d70d1a8c54505505b786cc12676dea8783: Status 404 returned error can't find the container with id 2ce9262d5fb63261e35b2cd7782ae4d70d1a8c54505505b786cc12676dea8783 Feb 27 00:10:40 crc kubenswrapper[4781]: E0227 00:10:40.704374 4781 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.89:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1897f1fb755d17ad openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:10:40.701700013 +0000 UTC m=+309.959239607,LastTimestamp:2026-02-27 00:10:40.701700013 
+0000 UTC m=+309.959239607,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:10:41 crc kubenswrapper[4781]: I0227 00:10:41.014116 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"2ce9262d5fb63261e35b2cd7782ae4d70d1a8c54505505b786cc12676dea8783"} Feb 27 00:10:41 crc kubenswrapper[4781]: I0227 00:10:41.318372 4781 status_manager.go:851] "Failed to get status for pod" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" pod="openshift-marketplace/certified-operators-52xgq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-52xgq\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:41 crc kubenswrapper[4781]: I0227 00:10:41.318925 4781 status_manager.go:851] "Failed to get status for pod" podUID="7c8795e9-9244-4cc4-a297-3aec68bf3588" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:41 crc kubenswrapper[4781]: I0227 00:10:41.319506 4781 status_manager.go:851] "Failed to get status for pod" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" pod="openshift-marketplace/redhat-operators-hcdz5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hcdz5\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:42 crc kubenswrapper[4781]: I0227 00:10:42.026345 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"2eaa3792b6261beb052f90f1385bc3b10531f8155545b8c3f61f792c5f8482ae"} Feb 27 00:10:42 crc kubenswrapper[4781]: I0227 00:10:42.027429 4781 status_manager.go:851] "Failed to get status for pod" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" pod="openshift-marketplace/redhat-operators-hcdz5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hcdz5\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:42 crc kubenswrapper[4781]: E0227 00:10:42.027563 4781 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.89:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 00:10:42 crc kubenswrapper[4781]: I0227 00:10:42.028206 4781 status_manager.go:851] "Failed to get status for pod" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" pod="openshift-marketplace/certified-operators-52xgq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-52xgq\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:42 crc kubenswrapper[4781]: I0227 00:10:42.028898 4781 status_manager.go:851] "Failed to get status for pod" podUID="7c8795e9-9244-4cc4-a297-3aec68bf3588" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:42 crc kubenswrapper[4781]: E0227 00:10:42.753578 4781 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:42 crc kubenswrapper[4781]: E0227 
00:10:42.754804 4781 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:42 crc kubenswrapper[4781]: E0227 00:10:42.755505 4781 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:42 crc kubenswrapper[4781]: E0227 00:10:42.756022 4781 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:42 crc kubenswrapper[4781]: E0227 00:10:42.756544 4781 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:42 crc kubenswrapper[4781]: I0227 00:10:42.756609 4781 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 27 00:10:42 crc kubenswrapper[4781]: E0227 00:10:42.757115 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="200ms" Feb 27 00:10:42 crc kubenswrapper[4781]: E0227 00:10:42.958700 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="400ms" Feb 27 
00:10:43 crc kubenswrapper[4781]: E0227 00:10:43.033767 4781 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.89:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 00:10:43 crc kubenswrapper[4781]: E0227 00:10:43.359917 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="800ms" Feb 27 00:10:43 crc kubenswrapper[4781]: E0227 00:10:43.567385 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:10:43Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:10:43Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:10:43Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T00:10:43Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:43 crc kubenswrapper[4781]: E0227 00:10:43.567918 4781 
kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:43 crc kubenswrapper[4781]: E0227 00:10:43.568370 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:43 crc kubenswrapper[4781]: E0227 00:10:43.568644 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:43 crc kubenswrapper[4781]: E0227 00:10:43.568930 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:43 crc kubenswrapper[4781]: E0227 00:10:43.568956 4781 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 00:10:44 crc kubenswrapper[4781]: E0227 00:10:44.161475 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="1.6s" Feb 27 00:10:44 crc kubenswrapper[4781]: E0227 00:10:44.511950 4781 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.89:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1897f1fb755d17ad 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 00:10:40.701700013 +0000 UTC m=+309.959239607,LastTimestamp:2026-02-27 00:10:40.701700013 +0000 UTC m=+309.959239607,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 00:10:45 crc kubenswrapper[4781]: E0227 00:10:45.762154 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="3.2s" Feb 27 00:10:47 crc kubenswrapper[4781]: I0227 00:10:47.308730 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:10:47 crc kubenswrapper[4781]: I0227 00:10:47.310578 4781 status_manager.go:851] "Failed to get status for pod" podUID="7c8795e9-9244-4cc4-a297-3aec68bf3588" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:47 crc kubenswrapper[4781]: I0227 00:10:47.312401 4781 status_manager.go:851] "Failed to get status for pod" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" pod="openshift-marketplace/redhat-operators-hcdz5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hcdz5\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:47 crc kubenswrapper[4781]: I0227 00:10:47.312964 4781 status_manager.go:851] "Failed to get status for pod" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" pod="openshift-marketplace/certified-operators-52xgq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-52xgq\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:47 crc kubenswrapper[4781]: I0227 00:10:47.328400 4781 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4018277d-2fc3-40ed-937a-cea43dacb894" Feb 27 00:10:47 crc kubenswrapper[4781]: I0227 00:10:47.328445 4781 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4018277d-2fc3-40ed-937a-cea43dacb894" Feb 27 00:10:47 crc kubenswrapper[4781]: E0227 00:10:47.328972 4781 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:10:47 crc kubenswrapper[4781]: I0227 00:10:47.330054 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:10:48 crc kubenswrapper[4781]: I0227 00:10:48.065600 4781 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="d67e71348f506acedfd49fdb01193f240453b3872fba7f2a05afdc150dc45413" exitCode=0 Feb 27 00:10:48 crc kubenswrapper[4781]: I0227 00:10:48.065700 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"d67e71348f506acedfd49fdb01193f240453b3872fba7f2a05afdc150dc45413"} Feb 27 00:10:48 crc kubenswrapper[4781]: I0227 00:10:48.065753 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f2ce1cb1f0fff81502f09902c57e6d3b5f4c1bac68a1f8eb9b62c40a247d6e03"} Feb 27 00:10:48 crc kubenswrapper[4781]: I0227 00:10:48.066323 4781 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4018277d-2fc3-40ed-937a-cea43dacb894" Feb 27 00:10:48 crc kubenswrapper[4781]: I0227 00:10:48.066381 4781 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4018277d-2fc3-40ed-937a-cea43dacb894" Feb 27 00:10:48 crc kubenswrapper[4781]: E0227 00:10:48.066928 4781 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:10:48 crc kubenswrapper[4781]: I0227 00:10:48.066971 4781 status_manager.go:851] "Failed to get status for 
pod" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" pod="openshift-marketplace/certified-operators-52xgq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-52xgq\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:48 crc kubenswrapper[4781]: I0227 00:10:48.067683 4781 status_manager.go:851] "Failed to get status for pod" podUID="7c8795e9-9244-4cc4-a297-3aec68bf3588" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:48 crc kubenswrapper[4781]: I0227 00:10:48.068160 4781 status_manager.go:851] "Failed to get status for pod" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" pod="openshift-marketplace/redhat-operators-hcdz5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-hcdz5\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 27 00:10:49 crc kubenswrapper[4781]: I0227 00:10:49.079092 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"73b4ddb5f02dd3ec03c7150c2b525684a96ea163e30959b1ed9e9a9674ccf851"} Feb 27 00:10:49 crc kubenswrapper[4781]: I0227 00:10:49.079408 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"552e0f533e4924e6179bea2f9c4b1fb5e7d0546b88086c647c01546499d8c0d4"} Feb 27 00:10:49 crc kubenswrapper[4781]: I0227 00:10:49.079419 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3e1e3d72e819f646e1d954dce8e87cc4f81172c85f1fe1c9c812b5e508ca8ebf"} Feb 27 00:10:49 crc kubenswrapper[4781]: I0227 00:10:49.079426 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3443d9392221569c894c37e3e3dc0d3d0be3bd7cf0464ea3ca776bc75e620169"} Feb 27 00:10:49 crc kubenswrapper[4781]: I0227 00:10:49.081916 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 27 00:10:49 crc kubenswrapper[4781]: I0227 00:10:49.082535 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 27 00:10:49 crc kubenswrapper[4781]: I0227 00:10:49.082581 4781 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="9607db636af16914ff311df9c254406cd6a32326a229fc879bd3923eba2ad477" exitCode=1 Feb 27 00:10:49 crc kubenswrapper[4781]: I0227 00:10:49.082623 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"9607db636af16914ff311df9c254406cd6a32326a229fc879bd3923eba2ad477"} Feb 27 00:10:49 crc kubenswrapper[4781]: I0227 00:10:49.083130 4781 scope.go:117] "RemoveContainer" containerID="9607db636af16914ff311df9c254406cd6a32326a229fc879bd3923eba2ad477" Feb 27 00:10:49 crc kubenswrapper[4781]: I0227 00:10:49.218091 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 00:10:50 crc kubenswrapper[4781]: I0227 00:10:50.096244 
4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 27 00:10:50 crc kubenswrapper[4781]: I0227 00:10:50.097468 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 27 00:10:50 crc kubenswrapper[4781]: I0227 00:10:50.097615 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"74e637562fda9341501dc3a4f8ff7bcfb06a0ef864a2010d7f005ad8286d96b1"} Feb 27 00:10:50 crc kubenswrapper[4781]: I0227 00:10:50.101956 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bf722891fbe8788a067a754a957ea026cbd8eccc9ffa5377e7b75e0242c2f3d1"} Feb 27 00:10:50 crc kubenswrapper[4781]: I0227 00:10:50.102242 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:10:50 crc kubenswrapper[4781]: I0227 00:10:50.102361 4781 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4018277d-2fc3-40ed-937a-cea43dacb894" Feb 27 00:10:50 crc kubenswrapper[4781]: I0227 00:10:50.102393 4781 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4018277d-2fc3-40ed-937a-cea43dacb894" Feb 27 00:10:52 crc kubenswrapper[4781]: I0227 00:10:52.330537 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:10:52 crc kubenswrapper[4781]: I0227 00:10:52.330927 4781 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:10:52 crc kubenswrapper[4781]: I0227 00:10:52.338060 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:10:55 crc kubenswrapper[4781]: I0227 00:10:55.002349 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 00:10:55 crc kubenswrapper[4781]: I0227 00:10:55.002554 4781 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 27 00:10:55 crc kubenswrapper[4781]: I0227 00:10:55.002766 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 27 00:10:55 crc kubenswrapper[4781]: I0227 00:10:55.113336 4781 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:10:55 crc kubenswrapper[4781]: I0227 00:10:55.132421 4781 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4018277d-2fc3-40ed-937a-cea43dacb894" Feb 27 00:10:55 crc kubenswrapper[4781]: I0227 00:10:55.132450 4781 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4018277d-2fc3-40ed-937a-cea43dacb894" Feb 27 00:10:55 crc kubenswrapper[4781]: I0227 00:10:55.137479 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:10:55 crc kubenswrapper[4781]: I0227 00:10:55.139669 4781 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="456df73e-2d2d-4980-b2e6-9e45d9cd002b" Feb 27 00:10:56 crc kubenswrapper[4781]: I0227 00:10:56.138186 4781 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4018277d-2fc3-40ed-937a-cea43dacb894" Feb 27 00:10:56 crc kubenswrapper[4781]: I0227 00:10:56.138222 4781 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4018277d-2fc3-40ed-937a-cea43dacb894" Feb 27 00:10:58 crc kubenswrapper[4781]: I0227 00:10:58.633429 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" podUID="cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" containerName="oauth-openshift" containerID="cri-o://126cdc4a0b2277218672d3ebb95a09f31d56ce658295a2f26c3118300b77f809" gracePeriod=15 Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.158083 4781 generic.go:334] "Generic (PLEG): container finished" podID="cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" containerID="126cdc4a0b2277218672d3ebb95a09f31d56ce658295a2f26c3118300b77f809" exitCode=0 Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.158141 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" event={"ID":"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b","Type":"ContainerDied","Data":"126cdc4a0b2277218672d3ebb95a09f31d56ce658295a2f26c3118300b77f809"} Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.218707 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.227175 4781 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.366508 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-template-login\") pod \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.366565 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-router-certs\") pod \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.366615 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-ocp-branding-template\") pod \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.366685 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-audit-policies\") pod \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.366730 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-template-provider-selection\") pod 
\"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.366789 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-cliconfig\") pod \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.366843 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-template-error\") pod \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.366876 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kr2kq\" (UniqueName: \"kubernetes.io/projected/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-kube-api-access-kr2kq\") pod \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.366911 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-idp-0-file-data\") pod \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.366947 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-trusted-ca-bundle\") pod \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\" (UID: 
\"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.366974 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-session\") pod \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.367005 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-serving-cert\") pod \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.367047 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-audit-dir\") pod \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.367081 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-service-ca\") pod \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\" (UID: \"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b\") " Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.368430 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" (UID: "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.368458 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" (UID: "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.368500 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" (UID: "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.368446 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" (UID: "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.368551 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" (UID: "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.372903 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" (UID: "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.373428 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" (UID: "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.373657 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" (UID: "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.374027 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" (UID: "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.374203 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-kube-api-access-kr2kq" (OuterVolumeSpecName: "kube-api-access-kr2kq") pod "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" (UID: "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b"). InnerVolumeSpecName "kube-api-access-kr2kq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.374395 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" (UID: "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.374905 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" (UID: "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.375150 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" (UID: "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.375778 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" (UID: "cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.470754 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.471022 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kr2kq\" (UniqueName: \"kubernetes.io/projected/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-kube-api-access-kr2kq\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.471210 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.471305 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.471394 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-session\") on node 
\"crc\" DevicePath \"\"" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.471481 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.471583 4781 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.471721 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.471818 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.471908 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.472025 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.472131 4781 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.472225 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 27 00:10:59 crc kubenswrapper[4781]: I0227 00:10:59.472316 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 27 00:11:00 crc kubenswrapper[4781]: I0227 00:11:00.167184 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" event={"ID":"cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b","Type":"ContainerDied","Data":"8fca182de0e27c23c7179b6016938f296917be6dc1ab0f25ef6496e1df363d8a"} Feb 27 00:11:00 crc kubenswrapper[4781]: I0227 00:11:00.167253 4781 scope.go:117] "RemoveContainer" containerID="126cdc4a0b2277218672d3ebb95a09f31d56ce658295a2f26c3118300b77f809" Feb 27 00:11:00 crc kubenswrapper[4781]: I0227 00:11:00.167256 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2zhrk" Feb 27 00:11:01 crc kubenswrapper[4781]: I0227 00:11:01.319152 4781 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="456df73e-2d2d-4980-b2e6-9e45d9cd002b" Feb 27 00:11:05 crc kubenswrapper[4781]: I0227 00:11:05.340932 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 00:11:05 crc kubenswrapper[4781]: I0227 00:11:05.347816 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 00:11:05 crc kubenswrapper[4781]: I0227 00:11:05.463455 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 27 00:11:05 crc kubenswrapper[4781]: I0227 00:11:05.853500 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 27 00:11:05 crc kubenswrapper[4781]: I0227 00:11:05.859814 4781 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 27 00:11:06 crc kubenswrapper[4781]: I0227 00:11:06.243683 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 27 00:11:06 crc kubenswrapper[4781]: I0227 00:11:06.374319 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 27 00:11:06 crc kubenswrapper[4781]: I0227 00:11:06.389298 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 27 00:11:06 crc kubenswrapper[4781]: I0227 00:11:06.400163 4781 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-operator"/"metrics-tls" Feb 27 00:11:06 crc kubenswrapper[4781]: I0227 00:11:06.541304 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 27 00:11:06 crc kubenswrapper[4781]: I0227 00:11:06.543272 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 27 00:11:06 crc kubenswrapper[4781]: I0227 00:11:06.592586 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 27 00:11:06 crc kubenswrapper[4781]: I0227 00:11:06.633451 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 27 00:11:06 crc kubenswrapper[4781]: I0227 00:11:06.884454 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 27 00:11:06 crc kubenswrapper[4781]: I0227 00:11:06.888136 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 27 00:11:07 crc kubenswrapper[4781]: I0227 00:11:07.127916 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 27 00:11:07 crc kubenswrapper[4781]: I0227 00:11:07.140329 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 27 00:11:07 crc kubenswrapper[4781]: I0227 00:11:07.527359 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 27 00:11:07 crc kubenswrapper[4781]: I0227 00:11:07.552854 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 27 00:11:07 crc kubenswrapper[4781]: I0227 00:11:07.700210 4781 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 27 00:11:07 crc kubenswrapper[4781]: I0227 00:11:07.797956 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 27 00:11:07 crc kubenswrapper[4781]: I0227 00:11:07.844959 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 27 00:11:07 crc kubenswrapper[4781]: I0227 00:11:07.885160 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 27 00:11:07 crc kubenswrapper[4781]: I0227 00:11:07.918769 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 27 00:11:07 crc kubenswrapper[4781]: I0227 00:11:07.930747 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 27 00:11:08 crc kubenswrapper[4781]: I0227 00:11:08.215846 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 27 00:11:08 crc kubenswrapper[4781]: I0227 00:11:08.228412 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 27 00:11:08 crc kubenswrapper[4781]: I0227 00:11:08.536563 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 27 00:11:08 crc kubenswrapper[4781]: I0227 00:11:08.654442 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 27 00:11:08 crc kubenswrapper[4781]: I0227 00:11:08.777087 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 27 00:11:08 crc kubenswrapper[4781]: I0227 00:11:08.861663 4781 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 27 00:11:08 crc kubenswrapper[4781]: I0227 00:11:08.895441 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 27 00:11:08 crc kubenswrapper[4781]: I0227 00:11:08.978787 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 27 00:11:09 crc kubenswrapper[4781]: I0227 00:11:09.071396 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 27 00:11:09 crc kubenswrapper[4781]: I0227 00:11:09.125347 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 27 00:11:09 crc kubenswrapper[4781]: I0227 00:11:09.382075 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 27 00:11:09 crc kubenswrapper[4781]: I0227 00:11:09.510582 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 27 00:11:09 crc kubenswrapper[4781]: I0227 00:11:09.646605 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 27 00:11:09 crc kubenswrapper[4781]: I0227 00:11:09.883595 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 27 00:11:09 crc kubenswrapper[4781]: I0227 00:11:09.980460 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.017208 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 
00:11:10.051936 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.065120 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.117382 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.132753 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.286060 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.286117 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.408914 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.467324 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.544834 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.648975 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.660974 4781 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.677492 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.699985 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.713510 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.754368 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.811090 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.873792 4781 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.955086 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 27 00:11:10 crc kubenswrapper[4781]: I0227 00:11:10.987389 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 27 00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.114876 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 27 00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.186401 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 27 
00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.244272 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 27 00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.245039 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 27 00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.326561 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 27 00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.343978 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 27 00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.374148 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 27 00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.492027 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 27 00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.500149 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 27 00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.536262 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 27 00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.601995 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 27 00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.701983 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 27 00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.755186 4781 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 27 00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.807319 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 27 00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.812173 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 27 00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.844485 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 27 00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.846778 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 27 00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.874344 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 27 00:11:11 crc kubenswrapper[4781]: I0227 00:11:11.931473 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 27 00:11:12 crc kubenswrapper[4781]: I0227 00:11:12.009849 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 27 00:11:12 crc kubenswrapper[4781]: I0227 00:11:12.252727 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 27 00:11:12 crc kubenswrapper[4781]: I0227 00:11:12.345996 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 27 00:11:12 crc kubenswrapper[4781]: I0227 00:11:12.355583 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 27 
00:11:12 crc kubenswrapper[4781]: I0227 00:11:12.473957 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 27 00:11:12 crc kubenswrapper[4781]: I0227 00:11:12.535798 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 27 00:11:12 crc kubenswrapper[4781]: I0227 00:11:12.639893 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 27 00:11:12 crc kubenswrapper[4781]: I0227 00:11:12.685213 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 27 00:11:12 crc kubenswrapper[4781]: I0227 00:11:12.750124 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 27 00:11:12 crc kubenswrapper[4781]: I0227 00:11:12.799790 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 27 00:11:12 crc kubenswrapper[4781]: I0227 00:11:12.890329 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 27 00:11:12 crc kubenswrapper[4781]: I0227 00:11:12.937456 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 27 00:11:13 crc kubenswrapper[4781]: I0227 00:11:13.085817 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 27 00:11:13 crc kubenswrapper[4781]: I0227 00:11:13.246384 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 27 00:11:13 crc kubenswrapper[4781]: I0227 00:11:13.259946 4781 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 27 00:11:13 crc kubenswrapper[4781]: I0227 00:11:13.377405 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 27 00:11:13 crc kubenswrapper[4781]: I0227 00:11:13.382949 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 27 00:11:13 crc kubenswrapper[4781]: I0227 00:11:13.430009 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 27 00:11:13 crc kubenswrapper[4781]: I0227 00:11:13.481679 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 27 00:11:13 crc kubenswrapper[4781]: I0227 00:11:13.495709 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 27 00:11:13 crc kubenswrapper[4781]: I0227 00:11:13.529830 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 27 00:11:13 crc kubenswrapper[4781]: I0227 00:11:13.569064 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 27 00:11:13 crc kubenswrapper[4781]: I0227 00:11:13.581255 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 27 00:11:13 crc kubenswrapper[4781]: I0227 00:11:13.706909 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 27 00:11:13 crc kubenswrapper[4781]: I0227 00:11:13.723376 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 27 00:11:13 crc 
kubenswrapper[4781]: I0227 00:11:13.830214 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 27 00:11:13 crc kubenswrapper[4781]: I0227 00:11:13.851730 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 27 00:11:13 crc kubenswrapper[4781]: I0227 00:11:13.940687 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 27 00:11:13 crc kubenswrapper[4781]: I0227 00:11:13.953844 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.052226 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.072239 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.176387 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.197263 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.241605 4781 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.253922 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.277662 4781 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.368898 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.418173 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.438098 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.449076 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.480969 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.527516 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.654311 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.725821 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.750951 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.762119 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 27 00:11:14 crc 
kubenswrapper[4781]: I0227 00:11:14.787929 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.814970 4781 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.823609 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.835509 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.936743 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.942768 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 27 00:11:14 crc kubenswrapper[4781]: I0227 00:11:14.997064 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 27 00:11:15 crc kubenswrapper[4781]: I0227 00:11:15.064851 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 27 00:11:15 crc kubenswrapper[4781]: I0227 00:11:15.066794 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 27 00:11:15 crc kubenswrapper[4781]: I0227 00:11:15.218052 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 27 00:11:15 crc kubenswrapper[4781]: I0227 00:11:15.363788 4781 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 27 00:11:15 crc kubenswrapper[4781]: I0227 00:11:15.431072 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 27 00:11:15 crc kubenswrapper[4781]: I0227 00:11:15.476198 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 27 00:11:15 crc kubenswrapper[4781]: I0227 00:11:15.550007 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 27 00:11:15 crc kubenswrapper[4781]: I0227 00:11:15.614417 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 27 00:11:15 crc kubenswrapper[4781]: I0227 00:11:15.675803 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 27 00:11:15 crc kubenswrapper[4781]: I0227 00:11:15.740663 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 27 00:11:15 crc kubenswrapper[4781]: I0227 00:11:15.760136 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 27 00:11:15 crc kubenswrapper[4781]: I0227 00:11:15.767703 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 27 00:11:15 crc kubenswrapper[4781]: I0227 00:11:15.935448 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 27 00:11:15 crc kubenswrapper[4781]: I0227 00:11:15.972681 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 27 00:11:15 crc kubenswrapper[4781]: I0227 00:11:15.983105 4781 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 27 00:11:16 crc kubenswrapper[4781]: I0227 00:11:16.011884 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 27 00:11:16 crc kubenswrapper[4781]: I0227 00:11:16.034777 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 27 00:11:16 crc kubenswrapper[4781]: I0227 00:11:16.048427 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 27 00:11:16 crc kubenswrapper[4781]: I0227 00:11:16.131436 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 27 00:11:16 crc kubenswrapper[4781]: I0227 00:11:16.132431 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 27 00:11:16 crc kubenswrapper[4781]: I0227 00:11:16.209080 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 27 00:11:16 crc kubenswrapper[4781]: I0227 00:11:16.269507 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 27 00:11:16 crc kubenswrapper[4781]: I0227 00:11:16.431062 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 27 00:11:16 crc kubenswrapper[4781]: I0227 00:11:16.481978 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 27 00:11:16 crc kubenswrapper[4781]: I0227 00:11:16.505800 4781 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"machine-api-operator-images" Feb 27 00:11:16 crc kubenswrapper[4781]: I0227 00:11:16.706923 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 27 00:11:16 crc kubenswrapper[4781]: I0227 00:11:16.772937 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 27 00:11:16 crc kubenswrapper[4781]: I0227 00:11:16.790698 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 27 00:11:16 crc kubenswrapper[4781]: I0227 00:11:16.843689 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 27 00:11:17 crc kubenswrapper[4781]: I0227 00:11:17.015604 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 27 00:11:17 crc kubenswrapper[4781]: I0227 00:11:17.111746 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 27 00:11:17 crc kubenswrapper[4781]: I0227 00:11:17.200298 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 27 00:11:17 crc kubenswrapper[4781]: I0227 00:11:17.318540 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 27 00:11:17 crc kubenswrapper[4781]: I0227 00:11:17.325438 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 27 00:11:17 crc kubenswrapper[4781]: I0227 00:11:17.397296 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 27 00:11:17 crc kubenswrapper[4781]: I0227 00:11:17.406322 4781 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 27 00:11:17 crc kubenswrapper[4781]: I0227 00:11:17.436353 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 27 00:11:17 crc kubenswrapper[4781]: I0227 00:11:17.480115 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 27 00:11:17 crc kubenswrapper[4781]: I0227 00:11:17.480539 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 27 00:11:17 crc kubenswrapper[4781]: I0227 00:11:17.538177 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 27 00:11:17 crc kubenswrapper[4781]: I0227 00:11:17.642289 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 27 00:11:17 crc kubenswrapper[4781]: I0227 00:11:17.698756 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 27 00:11:17 crc kubenswrapper[4781]: I0227 00:11:17.726999 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 27 00:11:17 crc kubenswrapper[4781]: I0227 00:11:17.794817 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 27 00:11:17 crc kubenswrapper[4781]: I0227 00:11:17.803235 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 27 00:11:17 crc kubenswrapper[4781]: I0227 00:11:17.889057 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 27 00:11:17 crc kubenswrapper[4781]: I0227 00:11:17.896836 4781 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 27 00:11:18 crc kubenswrapper[4781]: I0227 00:11:18.136683 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 27 00:11:18 crc kubenswrapper[4781]: I0227 00:11:18.203456 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 27 00:11:18 crc kubenswrapper[4781]: I0227 00:11:18.345781 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 27 00:11:18 crc kubenswrapper[4781]: I0227 00:11:18.549484 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 27 00:11:18 crc kubenswrapper[4781]: I0227 00:11:18.675247 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 27 00:11:18 crc kubenswrapper[4781]: I0227 00:11:18.698244 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 27 00:11:18 crc kubenswrapper[4781]: I0227 00:11:18.723180 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 27 00:11:18 crc kubenswrapper[4781]: I0227 00:11:18.727291 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 27 00:11:18 crc kubenswrapper[4781]: I0227 00:11:18.742041 4781 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 27 00:11:18 crc kubenswrapper[4781]: I0227 00:11:18.807290 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 27 00:11:18 crc 
kubenswrapper[4781]: I0227 00:11:18.813731 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 27 00:11:18 crc kubenswrapper[4781]: I0227 00:11:18.865549 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 27 00:11:18 crc kubenswrapper[4781]: I0227 00:11:18.915226 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 27 00:11:19 crc kubenswrapper[4781]: I0227 00:11:19.165018 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 27 00:11:19 crc kubenswrapper[4781]: I0227 00:11:19.208297 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 27 00:11:19 crc kubenswrapper[4781]: I0227 00:11:19.319342 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 27 00:11:19 crc kubenswrapper[4781]: I0227 00:11:19.476489 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 27 00:11:19 crc kubenswrapper[4781]: I0227 00:11:19.520309 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 27 00:11:19 crc kubenswrapper[4781]: I0227 00:11:19.603340 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 27 00:11:19 crc kubenswrapper[4781]: I0227 00:11:19.628007 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 27 00:11:19 crc kubenswrapper[4781]: I0227 00:11:19.638707 4781 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 27 00:11:19 crc kubenswrapper[4781]: I0227 00:11:19.670098 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 27 00:11:19 crc kubenswrapper[4781]: I0227 00:11:19.803947 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 27 00:11:19 crc kubenswrapper[4781]: I0227 00:11:19.865579 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 27 00:11:19 crc kubenswrapper[4781]: I0227 00:11:19.932272 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 27 00:11:19 crc kubenswrapper[4781]: I0227 00:11:19.944273 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.035939 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.068046 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.141869 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.146316 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.428499 4781 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"etcd-serving-ca" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.551644 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.729926 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.813465 4781 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.817925 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-2zhrk","openshift-marketplace/certified-operators-52xgq"] Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.817995 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-79d557fb64-sq8tq"] Feb 27 00:11:20 crc kubenswrapper[4781]: E0227 00:11:20.818161 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c8795e9-9244-4cc4-a297-3aec68bf3588" containerName="installer" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.818174 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c8795e9-9244-4cc4-a297-3aec68bf3588" containerName="installer" Feb 27 00:11:20 crc kubenswrapper[4781]: E0227 00:11:20.818190 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" containerName="oauth-openshift" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.818197 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" containerName="oauth-openshift" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.818378 4781 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="4018277d-2fc3-40ed-937a-cea43dacb894" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.818400 4781 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4018277d-2fc3-40ed-937a-cea43dacb894" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.818440 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c8795e9-9244-4cc4-a297-3aec68bf3588" containerName="installer" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.818458 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" containerName="oauth-openshift" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.818856 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.823070 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.823443 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.823608 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.823697 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.823852 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.824017 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 27 00:11:20 crc 
kubenswrapper[4781]: I0227 00:11:20.824139 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.824296 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.824613 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.826472 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.826823 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.826945 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.832553 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.834849 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.841957 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.852214 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.858807 4781 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=25.858774839 podStartE2EDuration="25.858774839s" podCreationTimestamp="2026-02-27 00:10:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:11:20.849792887 +0000 UTC m=+350.107332541" watchObservedRunningTime="2026-02-27 00:11:20.858774839 +0000 UTC m=+350.116314433" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.872838 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.946809 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.954718 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.954774 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.954813 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.954852 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-service-ca\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.954971 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.955022 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-user-template-login\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.955062 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-router-certs\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.955098 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ffe89698-729a-4a15-92c3-3a095a00fb26-audit-dir\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.955155 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.955210 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-session\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.955384 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ffe89698-729a-4a15-92c3-3a095a00fb26-audit-policies\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " 
pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.955477 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm28d\" (UniqueName: \"kubernetes.io/projected/ffe89698-729a-4a15-92c3-3a095a00fb26-kube-api-access-cm28d\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.955552 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-user-template-error\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:20 crc kubenswrapper[4781]: I0227 00:11:20.955672 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.027980 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.056767 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-session\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " 
pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.056847 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ffe89698-729a-4a15-92c3-3a095a00fb26-audit-policies\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.056883 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm28d\" (UniqueName: \"kubernetes.io/projected/ffe89698-729a-4a15-92c3-3a095a00fb26-kube-api-access-cm28d\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.056934 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-user-template-error\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.056977 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.057028 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.057072 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.057108 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.057144 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-service-ca\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.057202 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " 
pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.057237 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-user-template-login\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.057278 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-router-certs\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.057312 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ffe89698-729a-4a15-92c3-3a095a00fb26-audit-dir\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.057357 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.057527 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/ffe89698-729a-4a15-92c3-3a095a00fb26-audit-dir\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.058475 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.058869 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.058934 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-service-ca\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.059925 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ffe89698-729a-4a15-92c3-3a095a00fb26-audit-policies\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 
00:11:21.065724 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-router-certs\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.066141 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.066472 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.066710 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-user-template-error\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.066814 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.068352 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-session\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.069254 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.070386 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ffe89698-729a-4a15-92c3-3a095a00fb26-v4-0-config-user-template-login\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.080402 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm28d\" (UniqueName: \"kubernetes.io/projected/ffe89698-729a-4a15-92c3-3a095a00fb26-kube-api-access-cm28d\") pod \"oauth-openshift-79d557fb64-sq8tq\" (UID: \"ffe89698-729a-4a15-92c3-3a095a00fb26\") " pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 
27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.119528 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.146843 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.257901 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.317103 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f286d62-2145-4bbb-91eb-28ffda9b2494" path="/var/lib/kubelet/pods/0f286d62-2145-4bbb-91eb-28ffda9b2494/volumes" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.318273 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b" path="/var/lib/kubelet/pods/cf626ec7-00c1-4ea9-9e8a-1e4a2b66431b/volumes" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.339071 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.491984 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.492319 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.611420 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-79d557fb64-sq8tq"] Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.693507 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 27 00:11:21 crc kubenswrapper[4781]: 
I0227 00:11:21.750857 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.789699 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.972333 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 27 00:11:21 crc kubenswrapper[4781]: I0227 00:11:21.974144 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 27 00:11:22 crc kubenswrapper[4781]: I0227 00:11:22.077190 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 27 00:11:22 crc kubenswrapper[4781]: I0227 00:11:22.317488 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" event={"ID":"ffe89698-729a-4a15-92c3-3a095a00fb26","Type":"ContainerStarted","Data":"ee4c674b04255d04ca76a5be26bbbed3a63bf79f43764cb7a910fad54e7db94f"} Feb 27 00:11:22 crc kubenswrapper[4781]: I0227 00:11:22.317536 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" event={"ID":"ffe89698-729a-4a15-92c3-3a095a00fb26","Type":"ContainerStarted","Data":"fd787affcc48f57305115dbaeabfdf2a6725e9721c9f1989d94e4cbcaf0bdefd"} Feb 27 00:11:22 crc kubenswrapper[4781]: I0227 00:11:22.317875 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:22 crc kubenswrapper[4781]: I0227 00:11:22.318227 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 27 00:11:22 crc kubenswrapper[4781]: I0227 
00:11:22.448579 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" Feb 27 00:11:22 crc kubenswrapper[4781]: I0227 00:11:22.465766 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-79d557fb64-sq8tq" podStartSLOduration=49.465740896 podStartE2EDuration="49.465740896s" podCreationTimestamp="2026-02-27 00:10:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:11:22.347475361 +0000 UTC m=+351.605014905" watchObservedRunningTime="2026-02-27 00:11:22.465740896 +0000 UTC m=+351.723280480" Feb 27 00:11:22 crc kubenswrapper[4781]: I0227 00:11:22.488540 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 27 00:11:22 crc kubenswrapper[4781]: I0227 00:11:22.516211 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 27 00:11:22 crc kubenswrapper[4781]: I0227 00:11:22.530112 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 27 00:11:22 crc kubenswrapper[4781]: I0227 00:11:22.631929 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 27 00:11:22 crc kubenswrapper[4781]: I0227 00:11:22.739300 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 27 00:11:22 crc kubenswrapper[4781]: I0227 00:11:22.924679 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 27 00:11:23 crc kubenswrapper[4781]: I0227 00:11:23.092493 4781 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 27 00:11:23 crc kubenswrapper[4781]: I0227 00:11:23.597572 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 27 00:11:23 crc kubenswrapper[4781]: I0227 00:11:23.886067 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 27 00:11:24 crc kubenswrapper[4781]: I0227 00:11:24.563962 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 27 00:11:25 crc kubenswrapper[4781]: I0227 00:11:25.600697 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 27 00:11:28 crc kubenswrapper[4781]: I0227 00:11:28.802422 4781 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 27 00:11:28 crc kubenswrapper[4781]: I0227 00:11:28.804657 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://2eaa3792b6261beb052f90f1385bc3b10531f8155545b8c3f61f792c5f8482ae" gracePeriod=5 Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.389124 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.389186 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.405094 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.405132 4781 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="2eaa3792b6261beb052f90f1385bc3b10531f8155545b8c3f61f792c5f8482ae" exitCode=137 Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.405170 4781 scope.go:117] "RemoveContainer" containerID="2eaa3792b6261beb052f90f1385bc3b10531f8155545b8c3f61f792c5f8482ae" Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.405270 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.432429 4781 scope.go:117] "RemoveContainer" containerID="2eaa3792b6261beb052f90f1385bc3b10531f8155545b8c3f61f792c5f8482ae" Feb 27 00:11:34 crc kubenswrapper[4781]: E0227 00:11:34.433077 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2eaa3792b6261beb052f90f1385bc3b10531f8155545b8c3f61f792c5f8482ae\": container with ID starting with 2eaa3792b6261beb052f90f1385bc3b10531f8155545b8c3f61f792c5f8482ae not found: ID does not exist" containerID="2eaa3792b6261beb052f90f1385bc3b10531f8155545b8c3f61f792c5f8482ae" Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.433139 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2eaa3792b6261beb052f90f1385bc3b10531f8155545b8c3f61f792c5f8482ae"} err="failed to get container status \"2eaa3792b6261beb052f90f1385bc3b10531f8155545b8c3f61f792c5f8482ae\": rpc error: code = NotFound desc = could 
not find container \"2eaa3792b6261beb052f90f1385bc3b10531f8155545b8c3f61f792c5f8482ae\": container with ID starting with 2eaa3792b6261beb052f90f1385bc3b10531f8155545b8c3f61f792c5f8482ae not found: ID does not exist" Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.562374 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.562436 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.562524 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.562578 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.562663 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.562780 4781 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.562846 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.562895 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.564591 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.565115 4781 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.565146 4781 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.565167 4781 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.565586 4781 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.577027 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:11:34 crc kubenswrapper[4781]: I0227 00:11:34.666853 4781 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 27 00:11:35 crc kubenswrapper[4781]: I0227 00:11:35.321022 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 27 00:11:39 crc kubenswrapper[4781]: I0227 00:11:39.441481 4781 generic.go:334] "Generic (PLEG): container finished" podID="6dc17f1d-c1f4-43b9-9291-7c32c6804d44" containerID="ce3b476a42a9f3da02bf9f50b03dcf8217bc6886e083283c43ecded5c29ff43e" exitCode=0 Feb 27 00:11:39 crc kubenswrapper[4781]: I0227 00:11:39.441569 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" event={"ID":"6dc17f1d-c1f4-43b9-9291-7c32c6804d44","Type":"ContainerDied","Data":"ce3b476a42a9f3da02bf9f50b03dcf8217bc6886e083283c43ecded5c29ff43e"} Feb 27 00:11:39 crc kubenswrapper[4781]: I0227 00:11:39.443132 4781 scope.go:117] "RemoveContainer" containerID="ce3b476a42a9f3da02bf9f50b03dcf8217bc6886e083283c43ecded5c29ff43e" Feb 27 00:11:40 crc kubenswrapper[4781]: I0227 00:11:40.449239 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" event={"ID":"6dc17f1d-c1f4-43b9-9291-7c32c6804d44","Type":"ContainerStarted","Data":"d1d45f8ef9075f03107936fcf9c1b0c723f0771b6b8a40100b337a81ed99ffd1"} Feb 27 00:11:40 crc kubenswrapper[4781]: I0227 00:11:40.450315 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" Feb 27 00:11:40 crc kubenswrapper[4781]: I0227 00:11:40.452418 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" Feb 27 00:11:58 crc kubenswrapper[4781]: I0227 00:11:58.094299 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kqrgb"] Feb 27 00:11:58 crc kubenswrapper[4781]: I0227 00:11:58.095319 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kqrgb" podUID="ac30245d-7e42-440c-99a0-60e2ae15cb8b" containerName="registry-server" containerID="cri-o://ec7472b1d4abe3539fd2b9c6a74552c975f1e7a845d80d7f3684a0e55a838de1" gracePeriod=2 Feb 27 00:11:58 crc kubenswrapper[4781]: I0227 00:11:58.556373 4781 generic.go:334] "Generic (PLEG): container finished" podID="ac30245d-7e42-440c-99a0-60e2ae15cb8b" containerID="ec7472b1d4abe3539fd2b9c6a74552c975f1e7a845d80d7f3684a0e55a838de1" exitCode=0 Feb 27 00:11:58 crc kubenswrapper[4781]: I0227 00:11:58.556493 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kqrgb" event={"ID":"ac30245d-7e42-440c-99a0-60e2ae15cb8b","Type":"ContainerDied","Data":"ec7472b1d4abe3539fd2b9c6a74552c975f1e7a845d80d7f3684a0e55a838de1"} Feb 27 00:11:58 crc kubenswrapper[4781]: I0227 00:11:58.556792 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kqrgb" event={"ID":"ac30245d-7e42-440c-99a0-60e2ae15cb8b","Type":"ContainerDied","Data":"ba66da6dc8bfa69982da2943397bfec42cd942427662c0a4732f24accf5f77a6"} Feb 27 00:11:58 crc kubenswrapper[4781]: I0227 00:11:58.556817 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba66da6dc8bfa69982da2943397bfec42cd942427662c0a4732f24accf5f77a6" Feb 27 00:11:58 crc kubenswrapper[4781]: I0227 00:11:58.557044 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kqrgb" Feb 27 00:11:58 crc kubenswrapper[4781]: I0227 00:11:58.701212 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tk7f\" (UniqueName: \"kubernetes.io/projected/ac30245d-7e42-440c-99a0-60e2ae15cb8b-kube-api-access-8tk7f\") pod \"ac30245d-7e42-440c-99a0-60e2ae15cb8b\" (UID: \"ac30245d-7e42-440c-99a0-60e2ae15cb8b\") " Feb 27 00:11:58 crc kubenswrapper[4781]: I0227 00:11:58.701398 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac30245d-7e42-440c-99a0-60e2ae15cb8b-catalog-content\") pod \"ac30245d-7e42-440c-99a0-60e2ae15cb8b\" (UID: \"ac30245d-7e42-440c-99a0-60e2ae15cb8b\") " Feb 27 00:11:58 crc kubenswrapper[4781]: I0227 00:11:58.701443 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac30245d-7e42-440c-99a0-60e2ae15cb8b-utilities\") pod \"ac30245d-7e42-440c-99a0-60e2ae15cb8b\" (UID: \"ac30245d-7e42-440c-99a0-60e2ae15cb8b\") " Feb 27 00:11:58 crc kubenswrapper[4781]: I0227 00:11:58.703042 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac30245d-7e42-440c-99a0-60e2ae15cb8b-utilities" (OuterVolumeSpecName: "utilities") pod "ac30245d-7e42-440c-99a0-60e2ae15cb8b" (UID: "ac30245d-7e42-440c-99a0-60e2ae15cb8b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:11:58 crc kubenswrapper[4781]: I0227 00:11:58.709547 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac30245d-7e42-440c-99a0-60e2ae15cb8b-kube-api-access-8tk7f" (OuterVolumeSpecName: "kube-api-access-8tk7f") pod "ac30245d-7e42-440c-99a0-60e2ae15cb8b" (UID: "ac30245d-7e42-440c-99a0-60e2ae15cb8b"). InnerVolumeSpecName "kube-api-access-8tk7f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:11:58 crc kubenswrapper[4781]: I0227 00:11:58.777571 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac30245d-7e42-440c-99a0-60e2ae15cb8b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac30245d-7e42-440c-99a0-60e2ae15cb8b" (UID: "ac30245d-7e42-440c-99a0-60e2ae15cb8b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:11:58 crc kubenswrapper[4781]: I0227 00:11:58.802577 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tk7f\" (UniqueName: \"kubernetes.io/projected/ac30245d-7e42-440c-99a0-60e2ae15cb8b-kube-api-access-8tk7f\") on node \"crc\" DevicePath \"\"" Feb 27 00:11:58 crc kubenswrapper[4781]: I0227 00:11:58.802604 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac30245d-7e42-440c-99a0-60e2ae15cb8b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:11:58 crc kubenswrapper[4781]: I0227 00:11:58.802615 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac30245d-7e42-440c-99a0-60e2ae15cb8b-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:11:59 crc kubenswrapper[4781]: I0227 00:11:59.563534 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kqrgb" Feb 27 00:11:59 crc kubenswrapper[4781]: I0227 00:11:59.589096 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kqrgb"] Feb 27 00:11:59 crc kubenswrapper[4781]: I0227 00:11:59.597510 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kqrgb"] Feb 27 00:12:00 crc kubenswrapper[4781]: I0227 00:12:00.165476 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535852-49dfn"] Feb 27 00:12:00 crc kubenswrapper[4781]: E0227 00:12:00.166224 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac30245d-7e42-440c-99a0-60e2ae15cb8b" containerName="registry-server" Feb 27 00:12:00 crc kubenswrapper[4781]: I0227 00:12:00.166261 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac30245d-7e42-440c-99a0-60e2ae15cb8b" containerName="registry-server" Feb 27 00:12:00 crc kubenswrapper[4781]: E0227 00:12:00.166299 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac30245d-7e42-440c-99a0-60e2ae15cb8b" containerName="extract-utilities" Feb 27 00:12:00 crc kubenswrapper[4781]: I0227 00:12:00.166315 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac30245d-7e42-440c-99a0-60e2ae15cb8b" containerName="extract-utilities" Feb 27 00:12:00 crc kubenswrapper[4781]: E0227 00:12:00.166337 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac30245d-7e42-440c-99a0-60e2ae15cb8b" containerName="extract-content" Feb 27 00:12:00 crc kubenswrapper[4781]: I0227 00:12:00.166351 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac30245d-7e42-440c-99a0-60e2ae15cb8b" containerName="extract-content" Feb 27 00:12:00 crc kubenswrapper[4781]: E0227 00:12:00.166375 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 27 
00:12:00 crc kubenswrapper[4781]: I0227 00:12:00.166389 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 27 00:12:00 crc kubenswrapper[4781]: I0227 00:12:00.166593 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac30245d-7e42-440c-99a0-60e2ae15cb8b" containerName="registry-server" Feb 27 00:12:00 crc kubenswrapper[4781]: I0227 00:12:00.166620 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 27 00:12:00 crc kubenswrapper[4781]: I0227 00:12:00.167588 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535852-49dfn" Feb 27 00:12:00 crc kubenswrapper[4781]: I0227 00:12:00.174509 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:12:00 crc kubenswrapper[4781]: I0227 00:12:00.174512 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535852-49dfn"] Feb 27 00:12:00 crc kubenswrapper[4781]: I0227 00:12:00.174694 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:12:00 crc kubenswrapper[4781]: I0227 00:12:00.176013 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:12:00 crc kubenswrapper[4781]: I0227 00:12:00.345360 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gq6k\" (UniqueName: \"kubernetes.io/projected/96ecbd6e-c579-40ca-a5bf-9876777721f9-kube-api-access-8gq6k\") pod \"auto-csr-approver-29535852-49dfn\" (UID: \"96ecbd6e-c579-40ca-a5bf-9876777721f9\") " pod="openshift-infra/auto-csr-approver-29535852-49dfn" Feb 27 00:12:00 crc kubenswrapper[4781]: I0227 00:12:00.446493 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gq6k\" (UniqueName: \"kubernetes.io/projected/96ecbd6e-c579-40ca-a5bf-9876777721f9-kube-api-access-8gq6k\") pod \"auto-csr-approver-29535852-49dfn\" (UID: \"96ecbd6e-c579-40ca-a5bf-9876777721f9\") " pod="openshift-infra/auto-csr-approver-29535852-49dfn" Feb 27 00:12:00 crc kubenswrapper[4781]: I0227 00:12:00.469379 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gq6k\" (UniqueName: \"kubernetes.io/projected/96ecbd6e-c579-40ca-a5bf-9876777721f9-kube-api-access-8gq6k\") pod \"auto-csr-approver-29535852-49dfn\" (UID: \"96ecbd6e-c579-40ca-a5bf-9876777721f9\") " pod="openshift-infra/auto-csr-approver-29535852-49dfn" Feb 27 00:12:00 crc kubenswrapper[4781]: I0227 00:12:00.486043 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535852-49dfn" Feb 27 00:12:00 crc kubenswrapper[4781]: I0227 00:12:00.921490 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535852-49dfn"] Feb 27 00:12:01 crc kubenswrapper[4781]: I0227 00:12:01.317740 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac30245d-7e42-440c-99a0-60e2ae15cb8b" path="/var/lib/kubelet/pods/ac30245d-7e42-440c-99a0-60e2ae15cb8b/volumes" Feb 27 00:12:01 crc kubenswrapper[4781]: I0227 00:12:01.581084 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535852-49dfn" event={"ID":"96ecbd6e-c579-40ca-a5bf-9876777721f9","Type":"ContainerStarted","Data":"8d9ee590d475e3013eb4d1b9376a1cb32e7ad41ac823cc3a34078edf0741d292"} Feb 27 00:12:02 crc kubenswrapper[4781]: I0227 00:12:02.587433 4781 generic.go:334] "Generic (PLEG): container finished" podID="96ecbd6e-c579-40ca-a5bf-9876777721f9" containerID="5a1ffc2079241a21de7cc919695abf3baba7e2af15f91ad7d2c4786574ddb8a4" exitCode=0 Feb 27 00:12:02 crc kubenswrapper[4781]: 
I0227 00:12:02.587811 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535852-49dfn" event={"ID":"96ecbd6e-c579-40ca-a5bf-9876777721f9","Type":"ContainerDied","Data":"5a1ffc2079241a21de7cc919695abf3baba7e2af15f91ad7d2c4786574ddb8a4"} Feb 27 00:12:03 crc kubenswrapper[4781]: I0227 00:12:03.994463 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535852-49dfn" Feb 27 00:12:04 crc kubenswrapper[4781]: I0227 00:12:04.097212 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gq6k\" (UniqueName: \"kubernetes.io/projected/96ecbd6e-c579-40ca-a5bf-9876777721f9-kube-api-access-8gq6k\") pod \"96ecbd6e-c579-40ca-a5bf-9876777721f9\" (UID: \"96ecbd6e-c579-40ca-a5bf-9876777721f9\") " Feb 27 00:12:04 crc kubenswrapper[4781]: I0227 00:12:04.118067 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96ecbd6e-c579-40ca-a5bf-9876777721f9-kube-api-access-8gq6k" (OuterVolumeSpecName: "kube-api-access-8gq6k") pod "96ecbd6e-c579-40ca-a5bf-9876777721f9" (UID: "96ecbd6e-c579-40ca-a5bf-9876777721f9"). InnerVolumeSpecName "kube-api-access-8gq6k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:12:04 crc kubenswrapper[4781]: I0227 00:12:04.199157 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gq6k\" (UniqueName: \"kubernetes.io/projected/96ecbd6e-c579-40ca-a5bf-9876777721f9-kube-api-access-8gq6k\") on node \"crc\" DevicePath \"\"" Feb 27 00:12:04 crc kubenswrapper[4781]: I0227 00:12:04.620326 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535852-49dfn" event={"ID":"96ecbd6e-c579-40ca-a5bf-9876777721f9","Type":"ContainerDied","Data":"8d9ee590d475e3013eb4d1b9376a1cb32e7ad41ac823cc3a34078edf0741d292"} Feb 27 00:12:04 crc kubenswrapper[4781]: I0227 00:12:04.621080 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d9ee590d475e3013eb4d1b9376a1cb32e7ad41ac823cc3a34078edf0741d292" Feb 27 00:12:04 crc kubenswrapper[4781]: I0227 00:12:04.621235 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535852-49dfn" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.083179 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kztqg"] Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.083970 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kztqg" podUID="2b050e9e-d6c8-4e27-ad3f-9681553c1539" containerName="registry-server" containerID="cri-o://efb81711e0a5a335934438d3b0efa6534c86c0cbe83ca3ff2213393cf6293c2a" gracePeriod=30 Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.099832 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-42hbx"] Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.100268 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-42hbx" 
podUID="19ed5401-2778-4266-8bf1-1c7244dac100" containerName="registry-server" containerID="cri-o://22354d7c75573a2d5595bd3f22870bfd400c62f8b630846cdf77af9fd324f7ce" gracePeriod=30 Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.115957 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wgpv7"] Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.116291 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" podUID="6dc17f1d-c1f4-43b9-9291-7c32c6804d44" containerName="marketplace-operator" containerID="cri-o://d1d45f8ef9075f03107936fcf9c1b0c723f0771b6b8a40100b337a81ed99ffd1" gracePeriod=30 Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.134074 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9ngbg"] Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.134338 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9ngbg" podUID="baa593f3-06c4-461f-a893-609b07dfd282" containerName="registry-server" containerID="cri-o://282fc96c8ce87017d7802dd18ea9155171e4759e3a94a26367d5ecb97c6d0ec5" gracePeriod=30 Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.147701 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hcdz5"] Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.147964 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hcdz5" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" containerName="registry-server" containerID="cri-o://170f4f45b93514562c4db75daba0bc008f77b1104896d8febf847761526ee3f3" gracePeriod=30 Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.151236 4781 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-h5lrz"] Feb 27 00:12:16 crc kubenswrapper[4781]: E0227 00:12:16.151531 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96ecbd6e-c579-40ca-a5bf-9876777721f9" containerName="oc" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.151549 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="96ecbd6e-c579-40ca-a5bf-9876777721f9" containerName="oc" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.151675 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="96ecbd6e-c579-40ca-a5bf-9876777721f9" containerName="oc" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.152019 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-h5lrz" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.154715 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-h5lrz"] Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.184647 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/672e121e-2b7f-4454-b628-d99032669167-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-h5lrz\" (UID: \"672e121e-2b7f-4454-b628-d99032669167\") " pod="openshift-marketplace/marketplace-operator-79b997595-h5lrz" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.184818 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/672e121e-2b7f-4454-b628-d99032669167-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-h5lrz\" (UID: \"672e121e-2b7f-4454-b628-d99032669167\") " pod="openshift-marketplace/marketplace-operator-79b997595-h5lrz" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.184886 4781 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxd2b\" (UniqueName: \"kubernetes.io/projected/672e121e-2b7f-4454-b628-d99032669167-kube-api-access-bxd2b\") pod \"marketplace-operator-79b997595-h5lrz\" (UID: \"672e121e-2b7f-4454-b628-d99032669167\") " pod="openshift-marketplace/marketplace-operator-79b997595-h5lrz"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.285892 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/672e121e-2b7f-4454-b628-d99032669167-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-h5lrz\" (UID: \"672e121e-2b7f-4454-b628-d99032669167\") " pod="openshift-marketplace/marketplace-operator-79b997595-h5lrz"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.285947 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxd2b\" (UniqueName: \"kubernetes.io/projected/672e121e-2b7f-4454-b628-d99032669167-kube-api-access-bxd2b\") pod \"marketplace-operator-79b997595-h5lrz\" (UID: \"672e121e-2b7f-4454-b628-d99032669167\") " pod="openshift-marketplace/marketplace-operator-79b997595-h5lrz"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.286018 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/672e121e-2b7f-4454-b628-d99032669167-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-h5lrz\" (UID: \"672e121e-2b7f-4454-b628-d99032669167\") " pod="openshift-marketplace/marketplace-operator-79b997595-h5lrz"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.287173 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/672e121e-2b7f-4454-b628-d99032669167-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-h5lrz\" (UID: \"672e121e-2b7f-4454-b628-d99032669167\") " pod="openshift-marketplace/marketplace-operator-79b997595-h5lrz"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.291182 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/672e121e-2b7f-4454-b628-d99032669167-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-h5lrz\" (UID: \"672e121e-2b7f-4454-b628-d99032669167\") " pod="openshift-marketplace/marketplace-operator-79b997595-h5lrz"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.321583 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxd2b\" (UniqueName: \"kubernetes.io/projected/672e121e-2b7f-4454-b628-d99032669167-kube-api-access-bxd2b\") pod \"marketplace-operator-79b997595-h5lrz\" (UID: \"672e121e-2b7f-4454-b628-d99032669167\") " pod="openshift-marketplace/marketplace-operator-79b997595-h5lrz"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.493783 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-h5lrz"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.500507 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-42hbx"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.615477 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hcdz5"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.625016 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9ngbg"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.661114 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.679535 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kztqg"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.690989 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19ed5401-2778-4266-8bf1-1c7244dac100-catalog-content\") pod \"19ed5401-2778-4266-8bf1-1c7244dac100\" (UID: \"19ed5401-2778-4266-8bf1-1c7244dac100\") "
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.691061 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19ed5401-2778-4266-8bf1-1c7244dac100-utilities\") pod \"19ed5401-2778-4266-8bf1-1c7244dac100\" (UID: \"19ed5401-2778-4266-8bf1-1c7244dac100\") "
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.691146 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgndh\" (UniqueName: \"kubernetes.io/projected/19ed5401-2778-4266-8bf1-1c7244dac100-kube-api-access-xgndh\") pod \"19ed5401-2778-4266-8bf1-1c7244dac100\" (UID: \"19ed5401-2778-4266-8bf1-1c7244dac100\") "
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.693727 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19ed5401-2778-4266-8bf1-1c7244dac100-utilities" (OuterVolumeSpecName: "utilities") pod "19ed5401-2778-4266-8bf1-1c7244dac100" (UID: "19ed5401-2778-4266-8bf1-1c7244dac100"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.697546 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19ed5401-2778-4266-8bf1-1c7244dac100-kube-api-access-xgndh" (OuterVolumeSpecName: "kube-api-access-xgndh") pod "19ed5401-2778-4266-8bf1-1c7244dac100" (UID: "19ed5401-2778-4266-8bf1-1c7244dac100"). InnerVolumeSpecName "kube-api-access-xgndh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.709221 4781 generic.go:334] "Generic (PLEG): container finished" podID="19ed5401-2778-4266-8bf1-1c7244dac100" containerID="22354d7c75573a2d5595bd3f22870bfd400c62f8b630846cdf77af9fd324f7ce" exitCode=0
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.709279 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42hbx" event={"ID":"19ed5401-2778-4266-8bf1-1c7244dac100","Type":"ContainerDied","Data":"22354d7c75573a2d5595bd3f22870bfd400c62f8b630846cdf77af9fd324f7ce"}
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.709306 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-42hbx" event={"ID":"19ed5401-2778-4266-8bf1-1c7244dac100","Type":"ContainerDied","Data":"78b3df3f6b7f7425a9c2cd10f5b420e9f36ecb616bd533d5cfdfee3767475ccc"}
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.709322 4781 scope.go:117] "RemoveContainer" containerID="22354d7c75573a2d5595bd3f22870bfd400c62f8b630846cdf77af9fd324f7ce"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.709426 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-42hbx"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.715410 4781 generic.go:334] "Generic (PLEG): container finished" podID="baa593f3-06c4-461f-a893-609b07dfd282" containerID="282fc96c8ce87017d7802dd18ea9155171e4759e3a94a26367d5ecb97c6d0ec5" exitCode=0
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.715496 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9ngbg" event={"ID":"baa593f3-06c4-461f-a893-609b07dfd282","Type":"ContainerDied","Data":"282fc96c8ce87017d7802dd18ea9155171e4759e3a94a26367d5ecb97c6d0ec5"}
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.715529 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9ngbg" event={"ID":"baa593f3-06c4-461f-a893-609b07dfd282","Type":"ContainerDied","Data":"9502c5ad99503e1096d0070d626245f0844a912a2ffc6a125931ca6764817da5"}
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.715557 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9ngbg"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.718136 4781 generic.go:334] "Generic (PLEG): container finished" podID="6dc17f1d-c1f4-43b9-9291-7c32c6804d44" containerID="d1d45f8ef9075f03107936fcf9c1b0c723f0771b6b8a40100b337a81ed99ffd1" exitCode=0
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.718321 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.718619 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" event={"ID":"6dc17f1d-c1f4-43b9-9291-7c32c6804d44","Type":"ContainerDied","Data":"d1d45f8ef9075f03107936fcf9c1b0c723f0771b6b8a40100b337a81ed99ffd1"}
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.718659 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wgpv7" event={"ID":"6dc17f1d-c1f4-43b9-9291-7c32c6804d44","Type":"ContainerDied","Data":"eb45173a1f629c7ad2883098f5964e4563b43bb7bdca30eb6fc3bc6e2ce93911"}
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.724281 4781 generic.go:334] "Generic (PLEG): container finished" podID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" containerID="170f4f45b93514562c4db75daba0bc008f77b1104896d8febf847761526ee3f3" exitCode=0
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.724346 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hcdz5" event={"ID":"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0","Type":"ContainerDied","Data":"170f4f45b93514562c4db75daba0bc008f77b1104896d8febf847761526ee3f3"}
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.724372 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hcdz5" event={"ID":"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0","Type":"ContainerDied","Data":"6d980a6fc9de180882f2ee8cc193af0d7ab5d1ba875bfb8da4f55cc14f767f69"}
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.724430 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hcdz5"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.729738 4781 generic.go:334] "Generic (PLEG): container finished" podID="2b050e9e-d6c8-4e27-ad3f-9681553c1539" containerID="efb81711e0a5a335934438d3b0efa6534c86c0cbe83ca3ff2213393cf6293c2a" exitCode=0
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.729783 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kztqg" event={"ID":"2b050e9e-d6c8-4e27-ad3f-9681553c1539","Type":"ContainerDied","Data":"efb81711e0a5a335934438d3b0efa6534c86c0cbe83ca3ff2213393cf6293c2a"}
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.729810 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kztqg" event={"ID":"2b050e9e-d6c8-4e27-ad3f-9681553c1539","Type":"ContainerDied","Data":"f682c737bcb211243a2988ca17e566ea00c7e2d14bf78fba6f612945a62f66e6"}
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.729834 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kztqg"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.733475 4781 scope.go:117] "RemoveContainer" containerID="f154c29cb3bee2cd302922fe50110d556829b9ccb083957d00548536956edb1c"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.761154 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19ed5401-2778-4266-8bf1-1c7244dac100-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "19ed5401-2778-4266-8bf1-1c7244dac100" (UID: "19ed5401-2778-4266-8bf1-1c7244dac100"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.763882 4781 scope.go:117] "RemoveContainer" containerID="064248c42fab27054d637381f461db69b3ccaabc4e80c84639f005d76424769a"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.787546 4781 scope.go:117] "RemoveContainer" containerID="22354d7c75573a2d5595bd3f22870bfd400c62f8b630846cdf77af9fd324f7ce"
Feb 27 00:12:16 crc kubenswrapper[4781]: E0227 00:12:16.788181 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22354d7c75573a2d5595bd3f22870bfd400c62f8b630846cdf77af9fd324f7ce\": container with ID starting with 22354d7c75573a2d5595bd3f22870bfd400c62f8b630846cdf77af9fd324f7ce not found: ID does not exist" containerID="22354d7c75573a2d5595bd3f22870bfd400c62f8b630846cdf77af9fd324f7ce"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.788227 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22354d7c75573a2d5595bd3f22870bfd400c62f8b630846cdf77af9fd324f7ce"} err="failed to get container status \"22354d7c75573a2d5595bd3f22870bfd400c62f8b630846cdf77af9fd324f7ce\": rpc error: code = NotFound desc = could not find container \"22354d7c75573a2d5595bd3f22870bfd400c62f8b630846cdf77af9fd324f7ce\": container with ID starting with 22354d7c75573a2d5595bd3f22870bfd400c62f8b630846cdf77af9fd324f7ce not found: ID does not exist"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.788257 4781 scope.go:117] "RemoveContainer" containerID="f154c29cb3bee2cd302922fe50110d556829b9ccb083957d00548536956edb1c"
Feb 27 00:12:16 crc kubenswrapper[4781]: E0227 00:12:16.788531 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f154c29cb3bee2cd302922fe50110d556829b9ccb083957d00548536956edb1c\": container with ID starting with f154c29cb3bee2cd302922fe50110d556829b9ccb083957d00548536956edb1c not found: ID does not exist" containerID="f154c29cb3bee2cd302922fe50110d556829b9ccb083957d00548536956edb1c"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.788554 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f154c29cb3bee2cd302922fe50110d556829b9ccb083957d00548536956edb1c"} err="failed to get container status \"f154c29cb3bee2cd302922fe50110d556829b9ccb083957d00548536956edb1c\": rpc error: code = NotFound desc = could not find container \"f154c29cb3bee2cd302922fe50110d556829b9ccb083957d00548536956edb1c\": container with ID starting with f154c29cb3bee2cd302922fe50110d556829b9ccb083957d00548536956edb1c not found: ID does not exist"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.788569 4781 scope.go:117] "RemoveContainer" containerID="064248c42fab27054d637381f461db69b3ccaabc4e80c84639f005d76424769a"
Feb 27 00:12:16 crc kubenswrapper[4781]: E0227 00:12:16.788915 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"064248c42fab27054d637381f461db69b3ccaabc4e80c84639f005d76424769a\": container with ID starting with 064248c42fab27054d637381f461db69b3ccaabc4e80c84639f005d76424769a not found: ID does not exist" containerID="064248c42fab27054d637381f461db69b3ccaabc4e80c84639f005d76424769a"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.788934 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"064248c42fab27054d637381f461db69b3ccaabc4e80c84639f005d76424769a"} err="failed to get container status \"064248c42fab27054d637381f461db69b3ccaabc4e80c84639f005d76424769a\": rpc error: code = NotFound desc = could not find container \"064248c42fab27054d637381f461db69b3ccaabc4e80c84639f005d76424769a\": container with ID starting with 064248c42fab27054d637381f461db69b3ccaabc4e80c84639f005d76424769a not found: ID does not exist"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.788947 4781 scope.go:117] "RemoveContainer" containerID="282fc96c8ce87017d7802dd18ea9155171e4759e3a94a26367d5ecb97c6d0ec5"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.793421 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b050e9e-d6c8-4e27-ad3f-9681553c1539-utilities\") pod \"2b050e9e-d6c8-4e27-ad3f-9681553c1539\" (UID: \"2b050e9e-d6c8-4e27-ad3f-9681553c1539\") "
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.793579 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0-utilities\") pod \"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0\" (UID: \"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0\") "
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.793616 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0-catalog-content\") pod \"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0\" (UID: \"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0\") "
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.793670 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztvqm\" (UniqueName: \"kubernetes.io/projected/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0-kube-api-access-ztvqm\") pod \"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0\" (UID: \"a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0\") "
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.793727 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/baa593f3-06c4-461f-a893-609b07dfd282-utilities\") pod \"baa593f3-06c4-461f-a893-609b07dfd282\" (UID: \"baa593f3-06c4-461f-a893-609b07dfd282\") "
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.793782 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/baa593f3-06c4-461f-a893-609b07dfd282-catalog-content\") pod \"baa593f3-06c4-461f-a893-609b07dfd282\" (UID: \"baa593f3-06c4-461f-a893-609b07dfd282\") "
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.793836 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b050e9e-d6c8-4e27-ad3f-9681553c1539-catalog-content\") pod \"2b050e9e-d6c8-4e27-ad3f-9681553c1539\" (UID: \"2b050e9e-d6c8-4e27-ad3f-9681553c1539\") "
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.793853 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mw4cs\" (UniqueName: \"kubernetes.io/projected/6dc17f1d-c1f4-43b9-9291-7c32c6804d44-kube-api-access-mw4cs\") pod \"6dc17f1d-c1f4-43b9-9291-7c32c6804d44\" (UID: \"6dc17f1d-c1f4-43b9-9291-7c32c6804d44\") "
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.793934 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6dc17f1d-c1f4-43b9-9291-7c32c6804d44-marketplace-operator-metrics\") pod \"6dc17f1d-c1f4-43b9-9291-7c32c6804d44\" (UID: \"6dc17f1d-c1f4-43b9-9291-7c32c6804d44\") "
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.793974 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mqv2\" (UniqueName: \"kubernetes.io/projected/baa593f3-06c4-461f-a893-609b07dfd282-kube-api-access-9mqv2\") pod \"baa593f3-06c4-461f-a893-609b07dfd282\" (UID: \"baa593f3-06c4-461f-a893-609b07dfd282\") "
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.793994 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6dc17f1d-c1f4-43b9-9291-7c32c6804d44-marketplace-trusted-ca\") pod \"6dc17f1d-c1f4-43b9-9291-7c32c6804d44\" (UID: \"6dc17f1d-c1f4-43b9-9291-7c32c6804d44\") "
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.794017 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpnxw\" (UniqueName: \"kubernetes.io/projected/2b050e9e-d6c8-4e27-ad3f-9681553c1539-kube-api-access-zpnxw\") pod \"2b050e9e-d6c8-4e27-ad3f-9681553c1539\" (UID: \"2b050e9e-d6c8-4e27-ad3f-9681553c1539\") "
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.794179 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b050e9e-d6c8-4e27-ad3f-9681553c1539-utilities" (OuterVolumeSpecName: "utilities") pod "2b050e9e-d6c8-4e27-ad3f-9681553c1539" (UID: "2b050e9e-d6c8-4e27-ad3f-9681553c1539"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.794326 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b050e9e-d6c8-4e27-ad3f-9681553c1539-utilities\") on node \"crc\" DevicePath \"\""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.794482 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0-utilities" (OuterVolumeSpecName: "utilities") pod "a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" (UID: "a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.795021 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgndh\" (UniqueName: \"kubernetes.io/projected/19ed5401-2778-4266-8bf1-1c7244dac100-kube-api-access-xgndh\") on node \"crc\" DevicePath \"\""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.795052 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19ed5401-2778-4266-8bf1-1c7244dac100-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.795061 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19ed5401-2778-4266-8bf1-1c7244dac100-utilities\") on node \"crc\" DevicePath \"\""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.795153 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dc17f1d-c1f4-43b9-9291-7c32c6804d44-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "6dc17f1d-c1f4-43b9-9291-7c32c6804d44" (UID: "6dc17f1d-c1f4-43b9-9291-7c32c6804d44"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.795852 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/baa593f3-06c4-461f-a893-609b07dfd282-utilities" (OuterVolumeSpecName: "utilities") pod "baa593f3-06c4-461f-a893-609b07dfd282" (UID: "baa593f3-06c4-461f-a893-609b07dfd282"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.797713 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b050e9e-d6c8-4e27-ad3f-9681553c1539-kube-api-access-zpnxw" (OuterVolumeSpecName: "kube-api-access-zpnxw") pod "2b050e9e-d6c8-4e27-ad3f-9681553c1539" (UID: "2b050e9e-d6c8-4e27-ad3f-9681553c1539"). InnerVolumeSpecName "kube-api-access-zpnxw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.799759 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baa593f3-06c4-461f-a893-609b07dfd282-kube-api-access-9mqv2" (OuterVolumeSpecName: "kube-api-access-9mqv2") pod "baa593f3-06c4-461f-a893-609b07dfd282" (UID: "baa593f3-06c4-461f-a893-609b07dfd282"). InnerVolumeSpecName "kube-api-access-9mqv2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.805409 4781 scope.go:117] "RemoveContainer" containerID="c38378a09cf479f3808a13dfe25b0d67a7d8d45572911fa1d6158aba35ba104f"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.805689 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dc17f1d-c1f4-43b9-9291-7c32c6804d44-kube-api-access-mw4cs" (OuterVolumeSpecName: "kube-api-access-mw4cs") pod "6dc17f1d-c1f4-43b9-9291-7c32c6804d44" (UID: "6dc17f1d-c1f4-43b9-9291-7c32c6804d44"). InnerVolumeSpecName "kube-api-access-mw4cs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.808776 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0-kube-api-access-ztvqm" (OuterVolumeSpecName: "kube-api-access-ztvqm") pod "a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" (UID: "a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0"). InnerVolumeSpecName "kube-api-access-ztvqm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.809022 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dc17f1d-c1f4-43b9-9291-7c32c6804d44-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "6dc17f1d-c1f4-43b9-9291-7c32c6804d44" (UID: "6dc17f1d-c1f4-43b9-9291-7c32c6804d44"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.821373 4781 scope.go:117] "RemoveContainer" containerID="eacb00f064c2ddecdfb7b28e6b61c79e43e456c6031a0ac2b2b97148358dbc91"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.828066 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/baa593f3-06c4-461f-a893-609b07dfd282-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "baa593f3-06c4-461f-a893-609b07dfd282" (UID: "baa593f3-06c4-461f-a893-609b07dfd282"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.832431 4781 scope.go:117] "RemoveContainer" containerID="282fc96c8ce87017d7802dd18ea9155171e4759e3a94a26367d5ecb97c6d0ec5"
Feb 27 00:12:16 crc kubenswrapper[4781]: E0227 00:12:16.833929 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"282fc96c8ce87017d7802dd18ea9155171e4759e3a94a26367d5ecb97c6d0ec5\": container with ID starting with 282fc96c8ce87017d7802dd18ea9155171e4759e3a94a26367d5ecb97c6d0ec5 not found: ID does not exist" containerID="282fc96c8ce87017d7802dd18ea9155171e4759e3a94a26367d5ecb97c6d0ec5"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.834018 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"282fc96c8ce87017d7802dd18ea9155171e4759e3a94a26367d5ecb97c6d0ec5"} err="failed to get container status \"282fc96c8ce87017d7802dd18ea9155171e4759e3a94a26367d5ecb97c6d0ec5\": rpc error: code = NotFound desc = could not find container \"282fc96c8ce87017d7802dd18ea9155171e4759e3a94a26367d5ecb97c6d0ec5\": container with ID starting with 282fc96c8ce87017d7802dd18ea9155171e4759e3a94a26367d5ecb97c6d0ec5 not found: ID does not exist"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.834063 4781 scope.go:117] "RemoveContainer" containerID="c38378a09cf479f3808a13dfe25b0d67a7d8d45572911fa1d6158aba35ba104f"
Feb 27 00:12:16 crc kubenswrapper[4781]: E0227 00:12:16.834610 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c38378a09cf479f3808a13dfe25b0d67a7d8d45572911fa1d6158aba35ba104f\": container with ID starting with c38378a09cf479f3808a13dfe25b0d67a7d8d45572911fa1d6158aba35ba104f not found: ID does not exist" containerID="c38378a09cf479f3808a13dfe25b0d67a7d8d45572911fa1d6158aba35ba104f"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.834656 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c38378a09cf479f3808a13dfe25b0d67a7d8d45572911fa1d6158aba35ba104f"} err="failed to get container status \"c38378a09cf479f3808a13dfe25b0d67a7d8d45572911fa1d6158aba35ba104f\": rpc error: code = NotFound desc = could not find container \"c38378a09cf479f3808a13dfe25b0d67a7d8d45572911fa1d6158aba35ba104f\": container with ID starting with c38378a09cf479f3808a13dfe25b0d67a7d8d45572911fa1d6158aba35ba104f not found: ID does not exist"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.834681 4781 scope.go:117] "RemoveContainer" containerID="eacb00f064c2ddecdfb7b28e6b61c79e43e456c6031a0ac2b2b97148358dbc91"
Feb 27 00:12:16 crc kubenswrapper[4781]: E0227 00:12:16.835099 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eacb00f064c2ddecdfb7b28e6b61c79e43e456c6031a0ac2b2b97148358dbc91\": container with ID starting with eacb00f064c2ddecdfb7b28e6b61c79e43e456c6031a0ac2b2b97148358dbc91 not found: ID does not exist" containerID="eacb00f064c2ddecdfb7b28e6b61c79e43e456c6031a0ac2b2b97148358dbc91"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.835130 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eacb00f064c2ddecdfb7b28e6b61c79e43e456c6031a0ac2b2b97148358dbc91"} err="failed to get container status \"eacb00f064c2ddecdfb7b28e6b61c79e43e456c6031a0ac2b2b97148358dbc91\": rpc error: code = NotFound desc = could not find container \"eacb00f064c2ddecdfb7b28e6b61c79e43e456c6031a0ac2b2b97148358dbc91\": container with ID starting with eacb00f064c2ddecdfb7b28e6b61c79e43e456c6031a0ac2b2b97148358dbc91 not found: ID does not exist"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.835151 4781 scope.go:117] "RemoveContainer" containerID="d1d45f8ef9075f03107936fcf9c1b0c723f0771b6b8a40100b337a81ed99ffd1"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.849927 4781 scope.go:117] "RemoveContainer" containerID="ce3b476a42a9f3da02bf9f50b03dcf8217bc6886e083283c43ecded5c29ff43e"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.859684 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b050e9e-d6c8-4e27-ad3f-9681553c1539-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b050e9e-d6c8-4e27-ad3f-9681553c1539" (UID: "2b050e9e-d6c8-4e27-ad3f-9681553c1539"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.864313 4781 scope.go:117] "RemoveContainer" containerID="d1d45f8ef9075f03107936fcf9c1b0c723f0771b6b8a40100b337a81ed99ffd1"
Feb 27 00:12:16 crc kubenswrapper[4781]: E0227 00:12:16.865257 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1d45f8ef9075f03107936fcf9c1b0c723f0771b6b8a40100b337a81ed99ffd1\": container with ID starting with d1d45f8ef9075f03107936fcf9c1b0c723f0771b6b8a40100b337a81ed99ffd1 not found: ID does not exist" containerID="d1d45f8ef9075f03107936fcf9c1b0c723f0771b6b8a40100b337a81ed99ffd1"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.865297 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1d45f8ef9075f03107936fcf9c1b0c723f0771b6b8a40100b337a81ed99ffd1"} err="failed to get container status \"d1d45f8ef9075f03107936fcf9c1b0c723f0771b6b8a40100b337a81ed99ffd1\": rpc error: code = NotFound desc = could not find container \"d1d45f8ef9075f03107936fcf9c1b0c723f0771b6b8a40100b337a81ed99ffd1\": container with ID starting with d1d45f8ef9075f03107936fcf9c1b0c723f0771b6b8a40100b337a81ed99ffd1 not found: ID does not exist"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.865346 4781 scope.go:117] "RemoveContainer" containerID="ce3b476a42a9f3da02bf9f50b03dcf8217bc6886e083283c43ecded5c29ff43e"
Feb 27 00:12:16 crc kubenswrapper[4781]: E0227 00:12:16.866457 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce3b476a42a9f3da02bf9f50b03dcf8217bc6886e083283c43ecded5c29ff43e\": container with ID starting with ce3b476a42a9f3da02bf9f50b03dcf8217bc6886e083283c43ecded5c29ff43e not found: ID does not exist" containerID="ce3b476a42a9f3da02bf9f50b03dcf8217bc6886e083283c43ecded5c29ff43e"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.866576 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce3b476a42a9f3da02bf9f50b03dcf8217bc6886e083283c43ecded5c29ff43e"} err="failed to get container status \"ce3b476a42a9f3da02bf9f50b03dcf8217bc6886e083283c43ecded5c29ff43e\": rpc error: code = NotFound desc = could not find container \"ce3b476a42a9f3da02bf9f50b03dcf8217bc6886e083283c43ecded5c29ff43e\": container with ID starting with ce3b476a42a9f3da02bf9f50b03dcf8217bc6886e083283c43ecded5c29ff43e not found: ID does not exist"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.866662 4781 scope.go:117] "RemoveContainer" containerID="170f4f45b93514562c4db75daba0bc008f77b1104896d8febf847761526ee3f3"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.880317 4781 scope.go:117] "RemoveContainer" containerID="a025dd078f729ded0bf6d545799c77a77bef28096df89b8ae88c1133a03f1093"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.896338 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztvqm\" (UniqueName: \"kubernetes.io/projected/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0-kube-api-access-ztvqm\") on node \"crc\" DevicePath \"\""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.896383 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/baa593f3-06c4-461f-a893-609b07dfd282-utilities\") on node \"crc\" DevicePath \"\""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.896399 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/baa593f3-06c4-461f-a893-609b07dfd282-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.896412 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b050e9e-d6c8-4e27-ad3f-9681553c1539-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.896427 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mw4cs\" (UniqueName: \"kubernetes.io/projected/6dc17f1d-c1f4-43b9-9291-7c32c6804d44-kube-api-access-mw4cs\") on node \"crc\" DevicePath \"\""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.896440 4781 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6dc17f1d-c1f4-43b9-9291-7c32c6804d44-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.896454 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mqv2\" (UniqueName: \"kubernetes.io/projected/baa593f3-06c4-461f-a893-609b07dfd282-kube-api-access-9mqv2\") on node \"crc\" DevicePath \"\""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.896467 4781 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6dc17f1d-c1f4-43b9-9291-7c32c6804d44-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.896479 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpnxw\" (UniqueName: \"kubernetes.io/projected/2b050e9e-d6c8-4e27-ad3f-9681553c1539-kube-api-access-zpnxw\") on node \"crc\" DevicePath \"\""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.896490 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0-utilities\") on node \"crc\" DevicePath \"\""
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.898828 4781 scope.go:117] "RemoveContainer" containerID="a1770e607d622ea169288727b8ca0b7e8e1a65320533312b8c776b1e821a295a"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.909936 4781 scope.go:117] "RemoveContainer" containerID="170f4f45b93514562c4db75daba0bc008f77b1104896d8febf847761526ee3f3"
Feb 27 00:12:16 crc kubenswrapper[4781]: E0227 00:12:16.910327 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"170f4f45b93514562c4db75daba0bc008f77b1104896d8febf847761526ee3f3\": container with ID starting with 170f4f45b93514562c4db75daba0bc008f77b1104896d8febf847761526ee3f3 not found: ID does not exist" containerID="170f4f45b93514562c4db75daba0bc008f77b1104896d8febf847761526ee3f3"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.910368 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"170f4f45b93514562c4db75daba0bc008f77b1104896d8febf847761526ee3f3"} err="failed to get container status \"170f4f45b93514562c4db75daba0bc008f77b1104896d8febf847761526ee3f3\": rpc error: code = NotFound desc = could not find container \"170f4f45b93514562c4db75daba0bc008f77b1104896d8febf847761526ee3f3\": container with ID starting with 170f4f45b93514562c4db75daba0bc008f77b1104896d8febf847761526ee3f3 not found: ID does not exist"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.910395 4781 scope.go:117] "RemoveContainer" containerID="a025dd078f729ded0bf6d545799c77a77bef28096df89b8ae88c1133a03f1093"
Feb 27 00:12:16 crc kubenswrapper[4781]: E0227 00:12:16.910852 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a025dd078f729ded0bf6d545799c77a77bef28096df89b8ae88c1133a03f1093\": container with ID starting with a025dd078f729ded0bf6d545799c77a77bef28096df89b8ae88c1133a03f1093 not found: ID does not exist" containerID="a025dd078f729ded0bf6d545799c77a77bef28096df89b8ae88c1133a03f1093"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.910901 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a025dd078f729ded0bf6d545799c77a77bef28096df89b8ae88c1133a03f1093"} err="failed to get container status \"a025dd078f729ded0bf6d545799c77a77bef28096df89b8ae88c1133a03f1093\": rpc error: code = NotFound desc = could not find container \"a025dd078f729ded0bf6d545799c77a77bef28096df89b8ae88c1133a03f1093\": container with ID starting with a025dd078f729ded0bf6d545799c77a77bef28096df89b8ae88c1133a03f1093 not found: ID does not exist"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.910935 4781 scope.go:117] "RemoveContainer" containerID="a1770e607d622ea169288727b8ca0b7e8e1a65320533312b8c776b1e821a295a"
Feb 27 00:12:16 crc kubenswrapper[4781]: E0227 00:12:16.911241 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1770e607d622ea169288727b8ca0b7e8e1a65320533312b8c776b1e821a295a\": container with ID starting with a1770e607d622ea169288727b8ca0b7e8e1a65320533312b8c776b1e821a295a not found: ID does not exist" containerID="a1770e607d622ea169288727b8ca0b7e8e1a65320533312b8c776b1e821a295a"
Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.911269 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1770e607d622ea169288727b8ca0b7e8e1a65320533312b8c776b1e821a295a"} err="failed to get container status \"a1770e607d622ea169288727b8ca0b7e8e1a65320533312b8c776b1e821a295a\": rpc error: code = NotFound desc = could not find container \"a1770e607d622ea169288727b8ca0b7e8e1a65320533312b8c776b1e821a295a\": container with ID 
starting with a1770e607d622ea169288727b8ca0b7e8e1a65320533312b8c776b1e821a295a not found: ID does not exist" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.911286 4781 scope.go:117] "RemoveContainer" containerID="efb81711e0a5a335934438d3b0efa6534c86c0cbe83ca3ff2213393cf6293c2a" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.923790 4781 scope.go:117] "RemoveContainer" containerID="254d4c5bba2fd28d3a63e1836566567a36a209ed04e850df387dd2dfb34874d8" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.936741 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" (UID: "a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.949686 4781 scope.go:117] "RemoveContainer" containerID="d026f2efdb04e4237d23ee456b6f74e1a9fa53cf969148b770c424344e591fbd" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.967214 4781 scope.go:117] "RemoveContainer" containerID="efb81711e0a5a335934438d3b0efa6534c86c0cbe83ca3ff2213393cf6293c2a" Feb 27 00:12:16 crc kubenswrapper[4781]: E0227 00:12:16.967581 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efb81711e0a5a335934438d3b0efa6534c86c0cbe83ca3ff2213393cf6293c2a\": container with ID starting with efb81711e0a5a335934438d3b0efa6534c86c0cbe83ca3ff2213393cf6293c2a not found: ID does not exist" containerID="efb81711e0a5a335934438d3b0efa6534c86c0cbe83ca3ff2213393cf6293c2a" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.967615 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efb81711e0a5a335934438d3b0efa6534c86c0cbe83ca3ff2213393cf6293c2a"} err="failed to get container 
status \"efb81711e0a5a335934438d3b0efa6534c86c0cbe83ca3ff2213393cf6293c2a\": rpc error: code = NotFound desc = could not find container \"efb81711e0a5a335934438d3b0efa6534c86c0cbe83ca3ff2213393cf6293c2a\": container with ID starting with efb81711e0a5a335934438d3b0efa6534c86c0cbe83ca3ff2213393cf6293c2a not found: ID does not exist" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.967650 4781 scope.go:117] "RemoveContainer" containerID="254d4c5bba2fd28d3a63e1836566567a36a209ed04e850df387dd2dfb34874d8" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.967668 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-h5lrz"] Feb 27 00:12:16 crc kubenswrapper[4781]: E0227 00:12:16.967942 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"254d4c5bba2fd28d3a63e1836566567a36a209ed04e850df387dd2dfb34874d8\": container with ID starting with 254d4c5bba2fd28d3a63e1836566567a36a209ed04e850df387dd2dfb34874d8 not found: ID does not exist" containerID="254d4c5bba2fd28d3a63e1836566567a36a209ed04e850df387dd2dfb34874d8" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.967968 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"254d4c5bba2fd28d3a63e1836566567a36a209ed04e850df387dd2dfb34874d8"} err="failed to get container status \"254d4c5bba2fd28d3a63e1836566567a36a209ed04e850df387dd2dfb34874d8\": rpc error: code = NotFound desc = could not find container \"254d4c5bba2fd28d3a63e1836566567a36a209ed04e850df387dd2dfb34874d8\": container with ID starting with 254d4c5bba2fd28d3a63e1836566567a36a209ed04e850df387dd2dfb34874d8 not found: ID does not exist" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.967984 4781 scope.go:117] "RemoveContainer" containerID="d026f2efdb04e4237d23ee456b6f74e1a9fa53cf969148b770c424344e591fbd" Feb 27 00:12:16 crc kubenswrapper[4781]: E0227 00:12:16.968269 
4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d026f2efdb04e4237d23ee456b6f74e1a9fa53cf969148b770c424344e591fbd\": container with ID starting with d026f2efdb04e4237d23ee456b6f74e1a9fa53cf969148b770c424344e591fbd not found: ID does not exist" containerID="d026f2efdb04e4237d23ee456b6f74e1a9fa53cf969148b770c424344e591fbd" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.968291 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d026f2efdb04e4237d23ee456b6f74e1a9fa53cf969148b770c424344e591fbd"} err="failed to get container status \"d026f2efdb04e4237d23ee456b6f74e1a9fa53cf969148b770c424344e591fbd\": rpc error: code = NotFound desc = could not find container \"d026f2efdb04e4237d23ee456b6f74e1a9fa53cf969148b770c424344e591fbd\": container with ID starting with d026f2efdb04e4237d23ee456b6f74e1a9fa53cf969148b770c424344e591fbd not found: ID does not exist" Feb 27 00:12:16 crc kubenswrapper[4781]: I0227 00:12:16.997473 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.042265 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-42hbx"] Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.047141 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-42hbx"] Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.067599 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9ngbg"] Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.072802 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9ngbg"] Feb 27 00:12:17 crc kubenswrapper[4781]: 
I0227 00:12:17.081650 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hcdz5"] Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.086688 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hcdz5"] Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.102196 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kztqg"] Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.106052 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kztqg"] Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.108807 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wgpv7"] Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.112053 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wgpv7"] Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.315720 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19ed5401-2778-4266-8bf1-1c7244dac100" path="/var/lib/kubelet/pods/19ed5401-2778-4266-8bf1-1c7244dac100/volumes" Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.316619 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b050e9e-d6c8-4e27-ad3f-9681553c1539" path="/var/lib/kubelet/pods/2b050e9e-d6c8-4e27-ad3f-9681553c1539/volumes" Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.317370 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dc17f1d-c1f4-43b9-9291-7c32c6804d44" path="/var/lib/kubelet/pods/6dc17f1d-c1f4-43b9-9291-7c32c6804d44/volumes" Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.318513 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" 
path="/var/lib/kubelet/pods/a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0/volumes" Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.319210 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="baa593f3-06c4-461f-a893-609b07dfd282" path="/var/lib/kubelet/pods/baa593f3-06c4-461f-a893-609b07dfd282/volumes" Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.737560 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-h5lrz" event={"ID":"672e121e-2b7f-4454-b628-d99032669167","Type":"ContainerStarted","Data":"0ae642073089998589b3411622ec45694b1a13d8c7760ac9204b57878538de54"} Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.737614 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-h5lrz" event={"ID":"672e121e-2b7f-4454-b628-d99032669167","Type":"ContainerStarted","Data":"6aba3392a6bb9f45d6db84c5245963a2d13b5bb8600834bc595b81ced0fcb847"} Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.737851 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-h5lrz" Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.743587 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-h5lrz" Feb 27 00:12:17 crc kubenswrapper[4781]: I0227 00:12:17.757541 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-h5lrz" podStartSLOduration=1.757516319 podStartE2EDuration="1.757516319s" podCreationTimestamp="2026-02-27 00:12:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:12:17.756914723 +0000 UTC m=+407.014454277" watchObservedRunningTime="2026-02-27 00:12:17.757516319 +0000 UTC m=+407.015055903" Feb 27 00:12:42 
crc kubenswrapper[4781]: I0227 00:12:42.894982 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:12:42 crc kubenswrapper[4781]: I0227 00:12:42.895712 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.096611 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-kdd5s"] Feb 27 00:12:46 crc kubenswrapper[4781]: E0227 00:12:46.097032 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baa593f3-06c4-461f-a893-609b07dfd282" containerName="extract-utilities" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097044 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="baa593f3-06c4-461f-a893-609b07dfd282" containerName="extract-utilities" Feb 27 00:12:46 crc kubenswrapper[4781]: E0227 00:12:46.097054 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" containerName="extract-content" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097059 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" containerName="extract-content" Feb 27 00:12:46 crc kubenswrapper[4781]: E0227 00:12:46.097067 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baa593f3-06c4-461f-a893-609b07dfd282" containerName="extract-content" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097073 4781 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="baa593f3-06c4-461f-a893-609b07dfd282" containerName="extract-content" Feb 27 00:12:46 crc kubenswrapper[4781]: E0227 00:12:46.097086 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19ed5401-2778-4266-8bf1-1c7244dac100" containerName="extract-utilities" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097092 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="19ed5401-2778-4266-8bf1-1c7244dac100" containerName="extract-utilities" Feb 27 00:12:46 crc kubenswrapper[4781]: E0227 00:12:46.097099 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baa593f3-06c4-461f-a893-609b07dfd282" containerName="registry-server" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097104 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="baa593f3-06c4-461f-a893-609b07dfd282" containerName="registry-server" Feb 27 00:12:46 crc kubenswrapper[4781]: E0227 00:12:46.097113 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" containerName="extract-utilities" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097119 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" containerName="extract-utilities" Feb 27 00:12:46 crc kubenswrapper[4781]: E0227 00:12:46.097126 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b050e9e-d6c8-4e27-ad3f-9681553c1539" containerName="extract-utilities" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097132 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b050e9e-d6c8-4e27-ad3f-9681553c1539" containerName="extract-utilities" Feb 27 00:12:46 crc kubenswrapper[4781]: E0227 00:12:46.097142 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dc17f1d-c1f4-43b9-9291-7c32c6804d44" containerName="marketplace-operator" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097148 4781 
state_mem.go:107] "Deleted CPUSet assignment" podUID="6dc17f1d-c1f4-43b9-9291-7c32c6804d44" containerName="marketplace-operator" Feb 27 00:12:46 crc kubenswrapper[4781]: E0227 00:12:46.097154 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b050e9e-d6c8-4e27-ad3f-9681553c1539" containerName="extract-content" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097159 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b050e9e-d6c8-4e27-ad3f-9681553c1539" containerName="extract-content" Feb 27 00:12:46 crc kubenswrapper[4781]: E0227 00:12:46.097168 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19ed5401-2778-4266-8bf1-1c7244dac100" containerName="registry-server" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097173 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="19ed5401-2778-4266-8bf1-1c7244dac100" containerName="registry-server" Feb 27 00:12:46 crc kubenswrapper[4781]: E0227 00:12:46.097185 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" containerName="registry-server" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097190 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" containerName="registry-server" Feb 27 00:12:46 crc kubenswrapper[4781]: E0227 00:12:46.097198 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19ed5401-2778-4266-8bf1-1c7244dac100" containerName="extract-content" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097203 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="19ed5401-2778-4266-8bf1-1c7244dac100" containerName="extract-content" Feb 27 00:12:46 crc kubenswrapper[4781]: E0227 00:12:46.097212 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b050e9e-d6c8-4e27-ad3f-9681553c1539" containerName="registry-server" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097217 4781 
state_mem.go:107] "Deleted CPUSet assignment" podUID="2b050e9e-d6c8-4e27-ad3f-9681553c1539" containerName="registry-server" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097291 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b050e9e-d6c8-4e27-ad3f-9681553c1539" containerName="registry-server" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097301 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="baa593f3-06c4-461f-a893-609b07dfd282" containerName="registry-server" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097312 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dc17f1d-c1f4-43b9-9291-7c32c6804d44" containerName="marketplace-operator" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097319 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5cfeb4a-82c0-4a43-9a49-aab47f29aaa0" containerName="registry-server" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097327 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="19ed5401-2778-4266-8bf1-1c7244dac100" containerName="registry-server" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097334 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dc17f1d-c1f4-43b9-9291-7c32c6804d44" containerName="marketplace-operator" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.097694 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.109087 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-kdd5s"] Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.211888 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhsgl\" (UniqueName: \"kubernetes.io/projected/9214145c-17df-4f6a-9d5d-fa488256bf24-kube-api-access-xhsgl\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.211931 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9214145c-17df-4f6a-9d5d-fa488256bf24-installation-pull-secrets\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.211955 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.212001 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9214145c-17df-4f6a-9d5d-fa488256bf24-registry-certificates\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.212037 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9214145c-17df-4f6a-9d5d-fa488256bf24-bound-sa-token\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.212056 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9214145c-17df-4f6a-9d5d-fa488256bf24-ca-trust-extracted\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.212081 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9214145c-17df-4f6a-9d5d-fa488256bf24-trusted-ca\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.212105 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9214145c-17df-4f6a-9d5d-fa488256bf24-registry-tls\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.231029 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.313567 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9214145c-17df-4f6a-9d5d-fa488256bf24-bound-sa-token\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.313649 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9214145c-17df-4f6a-9d5d-fa488256bf24-ca-trust-extracted\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.313689 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9214145c-17df-4f6a-9d5d-fa488256bf24-trusted-ca\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.313713 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9214145c-17df-4f6a-9d5d-fa488256bf24-registry-tls\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.313959 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xhsgl\" (UniqueName: \"kubernetes.io/projected/9214145c-17df-4f6a-9d5d-fa488256bf24-kube-api-access-xhsgl\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.314404 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9214145c-17df-4f6a-9d5d-fa488256bf24-ca-trust-extracted\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.314735 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9214145c-17df-4f6a-9d5d-fa488256bf24-installation-pull-secrets\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.314879 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9214145c-17df-4f6a-9d5d-fa488256bf24-registry-certificates\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.315022 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9214145c-17df-4f6a-9d5d-fa488256bf24-trusted-ca\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 
00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.316171 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9214145c-17df-4f6a-9d5d-fa488256bf24-registry-certificates\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.319734 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9214145c-17df-4f6a-9d5d-fa488256bf24-installation-pull-secrets\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.327873 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9214145c-17df-4f6a-9d5d-fa488256bf24-registry-tls\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.332484 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9214145c-17df-4f6a-9d5d-fa488256bf24-bound-sa-token\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.337051 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhsgl\" (UniqueName: \"kubernetes.io/projected/9214145c-17df-4f6a-9d5d-fa488256bf24-kube-api-access-xhsgl\") pod \"image-registry-66df7c8f76-kdd5s\" (UID: \"9214145c-17df-4f6a-9d5d-fa488256bf24\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.413984 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.646260 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-kdd5s"] Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.927731 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" event={"ID":"9214145c-17df-4f6a-9d5d-fa488256bf24","Type":"ContainerStarted","Data":"b106ff048bebda9206279c84d9b98418e2ff640bd6d31039c45d27053e3ab869"} Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.927783 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" event={"ID":"9214145c-17df-4f6a-9d5d-fa488256bf24","Type":"ContainerStarted","Data":"10a05872b968a5c227876b0f31b8cab287f5dfa365e8e2186ce3887a6f9f2774"} Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.927945 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:12:46 crc kubenswrapper[4781]: I0227 00:12:46.951801 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" podStartSLOduration=0.951783284 podStartE2EDuration="951.783284ms" podCreationTimestamp="2026-02-27 00:12:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:12:46.949060493 +0000 UTC m=+436.206600037" watchObservedRunningTime="2026-02-27 00:12:46.951783284 +0000 UTC m=+436.209322848" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.114399 4781 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-operators-sp6x7"] Feb 27 00:12:47 crc kubenswrapper[4781]: E0227 00:12:47.115035 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dc17f1d-c1f4-43b9-9291-7c32c6804d44" containerName="marketplace-operator" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.115051 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dc17f1d-c1f4-43b9-9291-7c32c6804d44" containerName="marketplace-operator" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.118531 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sp6x7" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.120514 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.122877 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sp6x7"] Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.228061 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc9df096-6538-4b50-8536-bfdd5474eece-utilities\") pod \"redhat-operators-sp6x7\" (UID: \"dc9df096-6538-4b50-8536-bfdd5474eece\") " pod="openshift-marketplace/redhat-operators-sp6x7" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.228377 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2t2x\" (UniqueName: \"kubernetes.io/projected/dc9df096-6538-4b50-8536-bfdd5474eece-kube-api-access-r2t2x\") pod \"redhat-operators-sp6x7\" (UID: \"dc9df096-6538-4b50-8536-bfdd5474eece\") " pod="openshift-marketplace/redhat-operators-sp6x7" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.228548 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc9df096-6538-4b50-8536-bfdd5474eece-catalog-content\") pod \"redhat-operators-sp6x7\" (UID: \"dc9df096-6538-4b50-8536-bfdd5474eece\") " pod="openshift-marketplace/redhat-operators-sp6x7" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.314871 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sw9sn"] Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.315989 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sw9sn" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.318321 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.327220 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sw9sn"] Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.329490 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc9df096-6538-4b50-8536-bfdd5474eece-catalog-content\") pod \"redhat-operators-sp6x7\" (UID: \"dc9df096-6538-4b50-8536-bfdd5474eece\") " pod="openshift-marketplace/redhat-operators-sp6x7" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.330051 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc9df096-6538-4b50-8536-bfdd5474eece-catalog-content\") pod \"redhat-operators-sp6x7\" (UID: \"dc9df096-6538-4b50-8536-bfdd5474eece\") " pod="openshift-marketplace/redhat-operators-sp6x7" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.330743 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc9df096-6538-4b50-8536-bfdd5474eece-utilities\") 
pod \"redhat-operators-sp6x7\" (UID: \"dc9df096-6538-4b50-8536-bfdd5474eece\") " pod="openshift-marketplace/redhat-operators-sp6x7" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.330784 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2t2x\" (UniqueName: \"kubernetes.io/projected/dc9df096-6538-4b50-8536-bfdd5474eece-kube-api-access-r2t2x\") pod \"redhat-operators-sp6x7\" (UID: \"dc9df096-6538-4b50-8536-bfdd5474eece\") " pod="openshift-marketplace/redhat-operators-sp6x7" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.331033 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc9df096-6538-4b50-8536-bfdd5474eece-utilities\") pod \"redhat-operators-sp6x7\" (UID: \"dc9df096-6538-4b50-8536-bfdd5474eece\") " pod="openshift-marketplace/redhat-operators-sp6x7" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.354503 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2t2x\" (UniqueName: \"kubernetes.io/projected/dc9df096-6538-4b50-8536-bfdd5474eece-kube-api-access-r2t2x\") pod \"redhat-operators-sp6x7\" (UID: \"dc9df096-6538-4b50-8536-bfdd5474eece\") " pod="openshift-marketplace/redhat-operators-sp6x7" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.432038 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b6e0f47-560e-4d1a-8414-b65b1a159c68-utilities\") pod \"redhat-marketplace-sw9sn\" (UID: \"1b6e0f47-560e-4d1a-8414-b65b1a159c68\") " pod="openshift-marketplace/redhat-marketplace-sw9sn" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.432146 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b6e0f47-560e-4d1a-8414-b65b1a159c68-catalog-content\") pod 
\"redhat-marketplace-sw9sn\" (UID: \"1b6e0f47-560e-4d1a-8414-b65b1a159c68\") " pod="openshift-marketplace/redhat-marketplace-sw9sn" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.432242 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rvw2\" (UniqueName: \"kubernetes.io/projected/1b6e0f47-560e-4d1a-8414-b65b1a159c68-kube-api-access-8rvw2\") pod \"redhat-marketplace-sw9sn\" (UID: \"1b6e0f47-560e-4d1a-8414-b65b1a159c68\") " pod="openshift-marketplace/redhat-marketplace-sw9sn" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.436921 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sp6x7" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.534452 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rvw2\" (UniqueName: \"kubernetes.io/projected/1b6e0f47-560e-4d1a-8414-b65b1a159c68-kube-api-access-8rvw2\") pod \"redhat-marketplace-sw9sn\" (UID: \"1b6e0f47-560e-4d1a-8414-b65b1a159c68\") " pod="openshift-marketplace/redhat-marketplace-sw9sn" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.534805 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b6e0f47-560e-4d1a-8414-b65b1a159c68-utilities\") pod \"redhat-marketplace-sw9sn\" (UID: \"1b6e0f47-560e-4d1a-8414-b65b1a159c68\") " pod="openshift-marketplace/redhat-marketplace-sw9sn" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.534846 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b6e0f47-560e-4d1a-8414-b65b1a159c68-catalog-content\") pod \"redhat-marketplace-sw9sn\" (UID: \"1b6e0f47-560e-4d1a-8414-b65b1a159c68\") " pod="openshift-marketplace/redhat-marketplace-sw9sn" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.535549 
4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b6e0f47-560e-4d1a-8414-b65b1a159c68-utilities\") pod \"redhat-marketplace-sw9sn\" (UID: \"1b6e0f47-560e-4d1a-8414-b65b1a159c68\") " pod="openshift-marketplace/redhat-marketplace-sw9sn" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.536466 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b6e0f47-560e-4d1a-8414-b65b1a159c68-catalog-content\") pod \"redhat-marketplace-sw9sn\" (UID: \"1b6e0f47-560e-4d1a-8414-b65b1a159c68\") " pod="openshift-marketplace/redhat-marketplace-sw9sn" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.557685 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rvw2\" (UniqueName: \"kubernetes.io/projected/1b6e0f47-560e-4d1a-8414-b65b1a159c68-kube-api-access-8rvw2\") pod \"redhat-marketplace-sw9sn\" (UID: \"1b6e0f47-560e-4d1a-8414-b65b1a159c68\") " pod="openshift-marketplace/redhat-marketplace-sw9sn" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.649364 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sw9sn" Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.840473 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sp6x7"] Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.855177 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sw9sn"] Feb 27 00:12:47 crc kubenswrapper[4781]: W0227 00:12:47.860387 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b6e0f47_560e_4d1a_8414_b65b1a159c68.slice/crio-be41cdab5afb116c2dd348da76c4c90ffdc4f2598752927b7cd53ef460842451 WatchSource:0}: Error finding container be41cdab5afb116c2dd348da76c4c90ffdc4f2598752927b7cd53ef460842451: Status 404 returned error can't find the container with id be41cdab5afb116c2dd348da76c4c90ffdc4f2598752927b7cd53ef460842451 Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.934696 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sp6x7" event={"ID":"dc9df096-6538-4b50-8536-bfdd5474eece","Type":"ContainerStarted","Data":"0e534f4895fd12f31a5892b669b3034b19ea81c3dce0ed8633a17ea5fbab9974"} Feb 27 00:12:47 crc kubenswrapper[4781]: I0227 00:12:47.935729 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sw9sn" event={"ID":"1b6e0f47-560e-4d1a-8414-b65b1a159c68","Type":"ContainerStarted","Data":"be41cdab5afb116c2dd348da76c4c90ffdc4f2598752927b7cd53ef460842451"} Feb 27 00:12:48 crc kubenswrapper[4781]: I0227 00:12:48.941917 4781 generic.go:334] "Generic (PLEG): container finished" podID="1b6e0f47-560e-4d1a-8414-b65b1a159c68" containerID="ec58bc6bc7b876505ea1296e1abc01cb884fc82e5f593fe07e8246a1dd8a35cf" exitCode=0 Feb 27 00:12:48 crc kubenswrapper[4781]: I0227 00:12:48.942087 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-sw9sn" event={"ID":"1b6e0f47-560e-4d1a-8414-b65b1a159c68","Type":"ContainerDied","Data":"ec58bc6bc7b876505ea1296e1abc01cb884fc82e5f593fe07e8246a1dd8a35cf"} Feb 27 00:12:48 crc kubenswrapper[4781]: I0227 00:12:48.944382 4781 generic.go:334] "Generic (PLEG): container finished" podID="dc9df096-6538-4b50-8536-bfdd5474eece" containerID="cf9515ff8da586abfc861e302536bce5f9ffd87a480960cd1dba5923515e72f5" exitCode=0 Feb 27 00:12:48 crc kubenswrapper[4781]: I0227 00:12:48.944409 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sp6x7" event={"ID":"dc9df096-6538-4b50-8536-bfdd5474eece","Type":"ContainerDied","Data":"cf9515ff8da586abfc861e302536bce5f9ffd87a480960cd1dba5923515e72f5"} Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.512459 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pjpww"] Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.515483 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pjpww" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.517841 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.526341 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pjpww"] Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.575369 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47g7t\" (UniqueName: \"kubernetes.io/projected/5ef2a1c8-c174-456d-adff-2693b022fa83-kube-api-access-47g7t\") pod \"community-operators-pjpww\" (UID: \"5ef2a1c8-c174-456d-adff-2693b022fa83\") " pod="openshift-marketplace/community-operators-pjpww" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.575451 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ef2a1c8-c174-456d-adff-2693b022fa83-utilities\") pod \"community-operators-pjpww\" (UID: \"5ef2a1c8-c174-456d-adff-2693b022fa83\") " pod="openshift-marketplace/community-operators-pjpww" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.575490 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ef2a1c8-c174-456d-adff-2693b022fa83-catalog-content\") pod \"community-operators-pjpww\" (UID: \"5ef2a1c8-c174-456d-adff-2693b022fa83\") " pod="openshift-marketplace/community-operators-pjpww" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.677044 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ef2a1c8-c174-456d-adff-2693b022fa83-catalog-content\") pod \"community-operators-pjpww\" (UID: 
\"5ef2a1c8-c174-456d-adff-2693b022fa83\") " pod="openshift-marketplace/community-operators-pjpww" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.677151 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47g7t\" (UniqueName: \"kubernetes.io/projected/5ef2a1c8-c174-456d-adff-2693b022fa83-kube-api-access-47g7t\") pod \"community-operators-pjpww\" (UID: \"5ef2a1c8-c174-456d-adff-2693b022fa83\") " pod="openshift-marketplace/community-operators-pjpww" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.677206 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ef2a1c8-c174-456d-adff-2693b022fa83-utilities\") pod \"community-operators-pjpww\" (UID: \"5ef2a1c8-c174-456d-adff-2693b022fa83\") " pod="openshift-marketplace/community-operators-pjpww" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.677965 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ef2a1c8-c174-456d-adff-2693b022fa83-catalog-content\") pod \"community-operators-pjpww\" (UID: \"5ef2a1c8-c174-456d-adff-2693b022fa83\") " pod="openshift-marketplace/community-operators-pjpww" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.678067 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ef2a1c8-c174-456d-adff-2693b022fa83-utilities\") pod \"community-operators-pjpww\" (UID: \"5ef2a1c8-c174-456d-adff-2693b022fa83\") " pod="openshift-marketplace/community-operators-pjpww" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.696524 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47g7t\" (UniqueName: \"kubernetes.io/projected/5ef2a1c8-c174-456d-adff-2693b022fa83-kube-api-access-47g7t\") pod \"community-operators-pjpww\" (UID: 
\"5ef2a1c8-c174-456d-adff-2693b022fa83\") " pod="openshift-marketplace/community-operators-pjpww" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.709969 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kpswm"] Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.711309 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kpswm" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.713529 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.720857 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kpswm"] Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.778042 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9186313b-02fa-4d6f-9394-ab05a9e3d7d4-catalog-content\") pod \"certified-operators-kpswm\" (UID: \"9186313b-02fa-4d6f-9394-ab05a9e3d7d4\") " pod="openshift-marketplace/certified-operators-kpswm" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.778083 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9186313b-02fa-4d6f-9394-ab05a9e3d7d4-utilities\") pod \"certified-operators-kpswm\" (UID: \"9186313b-02fa-4d6f-9394-ab05a9e3d7d4\") " pod="openshift-marketplace/certified-operators-kpswm" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.778112 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwc74\" (UniqueName: \"kubernetes.io/projected/9186313b-02fa-4d6f-9394-ab05a9e3d7d4-kube-api-access-wwc74\") pod \"certified-operators-kpswm\" (UID: 
\"9186313b-02fa-4d6f-9394-ab05a9e3d7d4\") " pod="openshift-marketplace/certified-operators-kpswm" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.838501 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pjpww" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.879287 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9186313b-02fa-4d6f-9394-ab05a9e3d7d4-catalog-content\") pod \"certified-operators-kpswm\" (UID: \"9186313b-02fa-4d6f-9394-ab05a9e3d7d4\") " pod="openshift-marketplace/certified-operators-kpswm" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.879347 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9186313b-02fa-4d6f-9394-ab05a9e3d7d4-utilities\") pod \"certified-operators-kpswm\" (UID: \"9186313b-02fa-4d6f-9394-ab05a9e3d7d4\") " pod="openshift-marketplace/certified-operators-kpswm" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.879394 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwc74\" (UniqueName: \"kubernetes.io/projected/9186313b-02fa-4d6f-9394-ab05a9e3d7d4-kube-api-access-wwc74\") pod \"certified-operators-kpswm\" (UID: \"9186313b-02fa-4d6f-9394-ab05a9e3d7d4\") " pod="openshift-marketplace/certified-operators-kpswm" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.880009 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9186313b-02fa-4d6f-9394-ab05a9e3d7d4-utilities\") pod \"certified-operators-kpswm\" (UID: \"9186313b-02fa-4d6f-9394-ab05a9e3d7d4\") " pod="openshift-marketplace/certified-operators-kpswm" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.882453 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9186313b-02fa-4d6f-9394-ab05a9e3d7d4-catalog-content\") pod \"certified-operators-kpswm\" (UID: \"9186313b-02fa-4d6f-9394-ab05a9e3d7d4\") " pod="openshift-marketplace/certified-operators-kpswm" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.899445 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwc74\" (UniqueName: \"kubernetes.io/projected/9186313b-02fa-4d6f-9394-ab05a9e3d7d4-kube-api-access-wwc74\") pod \"certified-operators-kpswm\" (UID: \"9186313b-02fa-4d6f-9394-ab05a9e3d7d4\") " pod="openshift-marketplace/certified-operators-kpswm" Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.955646 4781 generic.go:334] "Generic (PLEG): container finished" podID="1b6e0f47-560e-4d1a-8414-b65b1a159c68" containerID="bc56036947f389c30da6f4d6cf65a2a2aa23b50e0d86c6420c1f6da94739bf2c" exitCode=0 Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.955938 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sw9sn" event={"ID":"1b6e0f47-560e-4d1a-8414-b65b1a159c68","Type":"ContainerDied","Data":"bc56036947f389c30da6f4d6cf65a2a2aa23b50e0d86c6420c1f6da94739bf2c"} Feb 27 00:12:49 crc kubenswrapper[4781]: I0227 00:12:49.959465 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sp6x7" event={"ID":"dc9df096-6538-4b50-8536-bfdd5474eece","Type":"ContainerStarted","Data":"c7fead24172fe5cc2930ab204e9faa584280644447a9f1c3406e8bebb16e2a9c"} Feb 27 00:12:50 crc kubenswrapper[4781]: I0227 00:12:50.066443 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pjpww"] Feb 27 00:12:50 crc kubenswrapper[4781]: I0227 00:12:50.068785 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kpswm" Feb 27 00:12:50 crc kubenswrapper[4781]: W0227 00:12:50.078006 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ef2a1c8_c174_456d_adff_2693b022fa83.slice/crio-1339fbb9ae8ae1924a262446b32a911cfb2db8083af9a1a8e3ffcecc9410f70e WatchSource:0}: Error finding container 1339fbb9ae8ae1924a262446b32a911cfb2db8083af9a1a8e3ffcecc9410f70e: Status 404 returned error can't find the container with id 1339fbb9ae8ae1924a262446b32a911cfb2db8083af9a1a8e3ffcecc9410f70e Feb 27 00:12:50 crc kubenswrapper[4781]: I0227 00:12:50.882499 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kpswm"] Feb 27 00:12:50 crc kubenswrapper[4781]: I0227 00:12:50.966562 4781 generic.go:334] "Generic (PLEG): container finished" podID="dc9df096-6538-4b50-8536-bfdd5474eece" containerID="c7fead24172fe5cc2930ab204e9faa584280644447a9f1c3406e8bebb16e2a9c" exitCode=0 Feb 27 00:12:50 crc kubenswrapper[4781]: I0227 00:12:50.966778 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sp6x7" event={"ID":"dc9df096-6538-4b50-8536-bfdd5474eece","Type":"ContainerDied","Data":"c7fead24172fe5cc2930ab204e9faa584280644447a9f1c3406e8bebb16e2a9c"} Feb 27 00:12:50 crc kubenswrapper[4781]: I0227 00:12:50.977402 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sw9sn" event={"ID":"1b6e0f47-560e-4d1a-8414-b65b1a159c68","Type":"ContainerStarted","Data":"56a11ee178e49c4a68132eea8bbfe49daa76c637a8b8ff7ab08d3049ae4223cf"} Feb 27 00:12:50 crc kubenswrapper[4781]: I0227 00:12:50.978969 4781 generic.go:334] "Generic (PLEG): container finished" podID="5ef2a1c8-c174-456d-adff-2693b022fa83" containerID="a30737fec7fa0dc292ac37010102da44f94592d0de29f83e0d51c4626c4936b5" exitCode=0 Feb 27 00:12:50 crc kubenswrapper[4781]: I0227 
00:12:50.979173 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pjpww" event={"ID":"5ef2a1c8-c174-456d-adff-2693b022fa83","Type":"ContainerDied","Data":"a30737fec7fa0dc292ac37010102da44f94592d0de29f83e0d51c4626c4936b5"} Feb 27 00:12:50 crc kubenswrapper[4781]: I0227 00:12:50.979481 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pjpww" event={"ID":"5ef2a1c8-c174-456d-adff-2693b022fa83","Type":"ContainerStarted","Data":"1339fbb9ae8ae1924a262446b32a911cfb2db8083af9a1a8e3ffcecc9410f70e"} Feb 27 00:12:50 crc kubenswrapper[4781]: I0227 00:12:50.980668 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kpswm" event={"ID":"9186313b-02fa-4d6f-9394-ab05a9e3d7d4","Type":"ContainerStarted","Data":"8f604256dd01c9824287221d4e6e219f5bc3aa3b917573f0f5924b28f43596d5"} Feb 27 00:12:51 crc kubenswrapper[4781]: I0227 00:12:51.000838 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sw9sn" podStartSLOduration=2.439722538 podStartE2EDuration="4.000821031s" podCreationTimestamp="2026-02-27 00:12:47 +0000 UTC" firstStartedPulling="2026-02-27 00:12:48.94387004 +0000 UTC m=+438.201409594" lastFinishedPulling="2026-02-27 00:12:50.504968533 +0000 UTC m=+439.762508087" observedRunningTime="2026-02-27 00:12:50.999812075 +0000 UTC m=+440.257351639" watchObservedRunningTime="2026-02-27 00:12:51.000821031 +0000 UTC m=+440.258360595" Feb 27 00:12:51 crc kubenswrapper[4781]: I0227 00:12:51.987445 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sp6x7" event={"ID":"dc9df096-6538-4b50-8536-bfdd5474eece","Type":"ContainerStarted","Data":"fd28bbc68e1ee130d63eaad06113a56225923cae8547a7c0754198026c6d7375"} Feb 27 00:12:51 crc kubenswrapper[4781]: I0227 00:12:51.988933 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-pjpww" event={"ID":"5ef2a1c8-c174-456d-adff-2693b022fa83","Type":"ContainerStarted","Data":"a3624f1dbc87dc2a10a15bacc09efc78fd1d0a601f477d1fadb684f3c3187f26"} Feb 27 00:12:51 crc kubenswrapper[4781]: I0227 00:12:51.990554 4781 generic.go:334] "Generic (PLEG): container finished" podID="9186313b-02fa-4d6f-9394-ab05a9e3d7d4" containerID="b7804c15b27555ed46326a0e5f53b71aa843ba5e64ce55c39d75f8df65107fd4" exitCode=0 Feb 27 00:12:51 crc kubenswrapper[4781]: I0227 00:12:51.990659 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kpswm" event={"ID":"9186313b-02fa-4d6f-9394-ab05a9e3d7d4","Type":"ContainerDied","Data":"b7804c15b27555ed46326a0e5f53b71aa843ba5e64ce55c39d75f8df65107fd4"} Feb 27 00:12:52 crc kubenswrapper[4781]: I0227 00:12:52.008018 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sp6x7" podStartSLOduration=2.533475212 podStartE2EDuration="5.008003268s" podCreationTimestamp="2026-02-27 00:12:47 +0000 UTC" firstStartedPulling="2026-02-27 00:12:48.945491252 +0000 UTC m=+438.203030806" lastFinishedPulling="2026-02-27 00:12:51.420019318 +0000 UTC m=+440.677558862" observedRunningTime="2026-02-27 00:12:52.00464226 +0000 UTC m=+441.262181824" watchObservedRunningTime="2026-02-27 00:12:52.008003268 +0000 UTC m=+441.265542812" Feb 27 00:12:52 crc kubenswrapper[4781]: I0227 00:12:52.997306 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kpswm" event={"ID":"9186313b-02fa-4d6f-9394-ab05a9e3d7d4","Type":"ContainerStarted","Data":"90585b6b5a7eeaf711facd15133df6ac4b7f831dd2a88a4db8bb5ac6f34036f0"} Feb 27 00:12:52 crc kubenswrapper[4781]: I0227 00:12:52.999416 4781 generic.go:334] "Generic (PLEG): container finished" podID="5ef2a1c8-c174-456d-adff-2693b022fa83" containerID="a3624f1dbc87dc2a10a15bacc09efc78fd1d0a601f477d1fadb684f3c3187f26" exitCode=0 Feb 27 
00:12:52 crc kubenswrapper[4781]: I0227 00:12:52.999489 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pjpww" event={"ID":"5ef2a1c8-c174-456d-adff-2693b022fa83","Type":"ContainerDied","Data":"a3624f1dbc87dc2a10a15bacc09efc78fd1d0a601f477d1fadb684f3c3187f26"} Feb 27 00:12:54 crc kubenswrapper[4781]: I0227 00:12:54.005491 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pjpww" event={"ID":"5ef2a1c8-c174-456d-adff-2693b022fa83","Type":"ContainerStarted","Data":"b14ff0d374bd72b5afdaa3e58794be865fb0037c254f022dfd680b12fd3232b3"} Feb 27 00:12:54 crc kubenswrapper[4781]: I0227 00:12:54.009816 4781 generic.go:334] "Generic (PLEG): container finished" podID="9186313b-02fa-4d6f-9394-ab05a9e3d7d4" containerID="90585b6b5a7eeaf711facd15133df6ac4b7f831dd2a88a4db8bb5ac6f34036f0" exitCode=0 Feb 27 00:12:54 crc kubenswrapper[4781]: I0227 00:12:54.009851 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kpswm" event={"ID":"9186313b-02fa-4d6f-9394-ab05a9e3d7d4","Type":"ContainerDied","Data":"90585b6b5a7eeaf711facd15133df6ac4b7f831dd2a88a4db8bb5ac6f34036f0"} Feb 27 00:12:54 crc kubenswrapper[4781]: I0227 00:12:54.027365 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pjpww" podStartSLOduration=2.23990224 podStartE2EDuration="5.027347892s" podCreationTimestamp="2026-02-27 00:12:49 +0000 UTC" firstStartedPulling="2026-02-27 00:12:50.97999696 +0000 UTC m=+440.237536514" lastFinishedPulling="2026-02-27 00:12:53.767442622 +0000 UTC m=+443.024982166" observedRunningTime="2026-02-27 00:12:54.024985971 +0000 UTC m=+443.282525525" watchObservedRunningTime="2026-02-27 00:12:54.027347892 +0000 UTC m=+443.284887456" Feb 27 00:12:55 crc kubenswrapper[4781]: I0227 00:12:55.017991 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-kpswm" event={"ID":"9186313b-02fa-4d6f-9394-ab05a9e3d7d4","Type":"ContainerStarted","Data":"2d99ae75d86c387a8e30e3e603db79600d116cb08d25683fcb2c54c22a28f8a6"} Feb 27 00:12:55 crc kubenswrapper[4781]: I0227 00:12:55.043051 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kpswm" podStartSLOduration=3.466674029 podStartE2EDuration="6.04303063s" podCreationTimestamp="2026-02-27 00:12:49 +0000 UTC" firstStartedPulling="2026-02-27 00:12:51.991574021 +0000 UTC m=+441.249113575" lastFinishedPulling="2026-02-27 00:12:54.567930612 +0000 UTC m=+443.825470176" observedRunningTime="2026-02-27 00:12:55.037365713 +0000 UTC m=+444.294905277" watchObservedRunningTime="2026-02-27 00:12:55.04303063 +0000 UTC m=+444.300570214" Feb 27 00:12:57 crc kubenswrapper[4781]: I0227 00:12:57.437608 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sp6x7" Feb 27 00:12:57 crc kubenswrapper[4781]: I0227 00:12:57.438869 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sp6x7" Feb 27 00:12:57 crc kubenswrapper[4781]: I0227 00:12:57.504474 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sp6x7" Feb 27 00:12:57 crc kubenswrapper[4781]: I0227 00:12:57.650178 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sw9sn" Feb 27 00:12:57 crc kubenswrapper[4781]: I0227 00:12:57.650737 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sw9sn" Feb 27 00:12:57 crc kubenswrapper[4781]: I0227 00:12:57.693556 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sw9sn" Feb 27 00:12:58 crc 
kubenswrapper[4781]: I0227 00:12:58.076509 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sp6x7" Feb 27 00:12:58 crc kubenswrapper[4781]: I0227 00:12:58.102702 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sw9sn" Feb 27 00:12:59 crc kubenswrapper[4781]: I0227 00:12:59.839418 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pjpww" Feb 27 00:12:59 crc kubenswrapper[4781]: I0227 00:12:59.839671 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pjpww" Feb 27 00:12:59 crc kubenswrapper[4781]: I0227 00:12:59.902773 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pjpww" Feb 27 00:13:00 crc kubenswrapper[4781]: I0227 00:13:00.069532 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kpswm" Feb 27 00:13:00 crc kubenswrapper[4781]: I0227 00:13:00.069581 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kpswm" Feb 27 00:13:00 crc kubenswrapper[4781]: I0227 00:13:00.092609 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pjpww" Feb 27 00:13:00 crc kubenswrapper[4781]: I0227 00:13:00.123702 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kpswm" Feb 27 00:13:01 crc kubenswrapper[4781]: I0227 00:13:01.090870 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kpswm" Feb 27 00:13:06 crc kubenswrapper[4781]: I0227 00:13:06.425600 4781 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-kdd5s" Feb 27 00:13:06 crc kubenswrapper[4781]: I0227 00:13:06.517757 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tw95c"] Feb 27 00:13:12 crc kubenswrapper[4781]: I0227 00:13:12.895052 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:13:12 crc kubenswrapper[4781]: I0227 00:13:12.895827 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:13:31 crc kubenswrapper[4781]: I0227 00:13:31.569477 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" podUID="16339491-baee-42b5-82bb-07bca82a5f77" containerName="registry" containerID="cri-o://c0a9479d46d082558311a372af09df12d1f797c7cdde2ce644f1af261b389077" gracePeriod=30 Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.034048 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.159878 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/16339491-baee-42b5-82bb-07bca82a5f77-installation-pull-secrets\") pod \"16339491-baee-42b5-82bb-07bca82a5f77\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.160019 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/16339491-baee-42b5-82bb-07bca82a5f77-registry-tls\") pod \"16339491-baee-42b5-82bb-07bca82a5f77\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.160058 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16339491-baee-42b5-82bb-07bca82a5f77-trusted-ca\") pod \"16339491-baee-42b5-82bb-07bca82a5f77\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.160091 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/16339491-baee-42b5-82bb-07bca82a5f77-ca-trust-extracted\") pod \"16339491-baee-42b5-82bb-07bca82a5f77\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.160125 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/16339491-baee-42b5-82bb-07bca82a5f77-bound-sa-token\") pod \"16339491-baee-42b5-82bb-07bca82a5f77\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.160144 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-fwd7v\" (UniqueName: \"kubernetes.io/projected/16339491-baee-42b5-82bb-07bca82a5f77-kube-api-access-fwd7v\") pod \"16339491-baee-42b5-82bb-07bca82a5f77\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.160289 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"16339491-baee-42b5-82bb-07bca82a5f77\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.160324 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/16339491-baee-42b5-82bb-07bca82a5f77-registry-certificates\") pod \"16339491-baee-42b5-82bb-07bca82a5f77\" (UID: \"16339491-baee-42b5-82bb-07bca82a5f77\") " Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.162089 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16339491-baee-42b5-82bb-07bca82a5f77-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "16339491-baee-42b5-82bb-07bca82a5f77" (UID: "16339491-baee-42b5-82bb-07bca82a5f77"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.164264 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16339491-baee-42b5-82bb-07bca82a5f77-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "16339491-baee-42b5-82bb-07bca82a5f77" (UID: "16339491-baee-42b5-82bb-07bca82a5f77"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.169853 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16339491-baee-42b5-82bb-07bca82a5f77-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "16339491-baee-42b5-82bb-07bca82a5f77" (UID: "16339491-baee-42b5-82bb-07bca82a5f77"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.170147 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16339491-baee-42b5-82bb-07bca82a5f77-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "16339491-baee-42b5-82bb-07bca82a5f77" (UID: "16339491-baee-42b5-82bb-07bca82a5f77"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.170249 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16339491-baee-42b5-82bb-07bca82a5f77-kube-api-access-fwd7v" (OuterVolumeSpecName: "kube-api-access-fwd7v") pod "16339491-baee-42b5-82bb-07bca82a5f77" (UID: "16339491-baee-42b5-82bb-07bca82a5f77"). InnerVolumeSpecName "kube-api-access-fwd7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.170673 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16339491-baee-42b5-82bb-07bca82a5f77-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "16339491-baee-42b5-82bb-07bca82a5f77" (UID: "16339491-baee-42b5-82bb-07bca82a5f77"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.170733 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "16339491-baee-42b5-82bb-07bca82a5f77" (UID: "16339491-baee-42b5-82bb-07bca82a5f77"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.181338 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16339491-baee-42b5-82bb-07bca82a5f77-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "16339491-baee-42b5-82bb-07bca82a5f77" (UID: "16339491-baee-42b5-82bb-07bca82a5f77"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.235201 4781 generic.go:334] "Generic (PLEG): container finished" podID="16339491-baee-42b5-82bb-07bca82a5f77" containerID="c0a9479d46d082558311a372af09df12d1f797c7cdde2ce644f1af261b389077" exitCode=0 Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.235248 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" event={"ID":"16339491-baee-42b5-82bb-07bca82a5f77","Type":"ContainerDied","Data":"c0a9479d46d082558311a372af09df12d1f797c7cdde2ce644f1af261b389077"} Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.235276 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" event={"ID":"16339491-baee-42b5-82bb-07bca82a5f77","Type":"ContainerDied","Data":"baa2ed7e45a407c61fcadf3b6fb1abb2bf58b2f1863ead5f5bd18f0e92393602"} Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.235275 4781 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tw95c" Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.235304 4781 scope.go:117] "RemoveContainer" containerID="c0a9479d46d082558311a372af09df12d1f797c7cdde2ce644f1af261b389077" Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.261843 4781 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/16339491-baee-42b5-82bb-07bca82a5f77-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.261891 4781 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/16339491-baee-42b5-82bb-07bca82a5f77-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.261908 4781 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/16339491-baee-42b5-82bb-07bca82a5f77-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.261922 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16339491-baee-42b5-82bb-07bca82a5f77-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.261941 4781 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/16339491-baee-42b5-82bb-07bca82a5f77-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.261956 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwd7v\" (UniqueName: \"kubernetes.io/projected/16339491-baee-42b5-82bb-07bca82a5f77-kube-api-access-fwd7v\") on node \"crc\" DevicePath \"\"" Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 
00:13:32.261972 4781 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/16339491-baee-42b5-82bb-07bca82a5f77-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.270095 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tw95c"] Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.272164 4781 scope.go:117] "RemoveContainer" containerID="c0a9479d46d082558311a372af09df12d1f797c7cdde2ce644f1af261b389077" Feb 27 00:13:32 crc kubenswrapper[4781]: E0227 00:13:32.272806 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0a9479d46d082558311a372af09df12d1f797c7cdde2ce644f1af261b389077\": container with ID starting with c0a9479d46d082558311a372af09df12d1f797c7cdde2ce644f1af261b389077 not found: ID does not exist" containerID="c0a9479d46d082558311a372af09df12d1f797c7cdde2ce644f1af261b389077" Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.272833 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0a9479d46d082558311a372af09df12d1f797c7cdde2ce644f1af261b389077"} err="failed to get container status \"c0a9479d46d082558311a372af09df12d1f797c7cdde2ce644f1af261b389077\": rpc error: code = NotFound desc = could not find container \"c0a9479d46d082558311a372af09df12d1f797c7cdde2ce644f1af261b389077\": container with ID starting with c0a9479d46d082558311a372af09df12d1f797c7cdde2ce644f1af261b389077 not found: ID does not exist" Feb 27 00:13:32 crc kubenswrapper[4781]: I0227 00:13:32.275424 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tw95c"] Feb 27 00:13:33 crc kubenswrapper[4781]: I0227 00:13:33.317343 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16339491-baee-42b5-82bb-07bca82a5f77" 
path="/var/lib/kubelet/pods/16339491-baee-42b5-82bb-07bca82a5f77/volumes" Feb 27 00:13:42 crc kubenswrapper[4781]: I0227 00:13:42.895872 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:13:42 crc kubenswrapper[4781]: I0227 00:13:42.896534 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:13:42 crc kubenswrapper[4781]: I0227 00:13:42.896596 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:13:42 crc kubenswrapper[4781]: I0227 00:13:42.897285 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0a9584e9887d3110a6a6d2ad5c5024fb38c734637c177fd2cbddb2eae4932cdc"} pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 00:13:42 crc kubenswrapper[4781]: I0227 00:13:42.897357 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" containerID="cri-o://0a9584e9887d3110a6a6d2ad5c5024fb38c734637c177fd2cbddb2eae4932cdc" gracePeriod=600 Feb 27 00:13:43 crc kubenswrapper[4781]: I0227 00:13:43.311017 4781 generic.go:334] "Generic (PLEG): container finished" 
podID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerID="0a9584e9887d3110a6a6d2ad5c5024fb38c734637c177fd2cbddb2eae4932cdc" exitCode=0 Feb 27 00:13:43 crc kubenswrapper[4781]: I0227 00:13:43.314796 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerDied","Data":"0a9584e9887d3110a6a6d2ad5c5024fb38c734637c177fd2cbddb2eae4932cdc"} Feb 27 00:13:43 crc kubenswrapper[4781]: I0227 00:13:43.314838 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"98d9908780c17a21a4a701f7c994bde3e3fbb6ea911f1b4e11c3a27ce7db4d1d"} Feb 27 00:13:43 crc kubenswrapper[4781]: I0227 00:13:43.314855 4781 scope.go:117] "RemoveContainer" containerID="f5be70a2916213c961759992806ae032decbc8c1382f7d82de2a5da221aee089" Feb 27 00:14:00 crc kubenswrapper[4781]: I0227 00:14:00.133198 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535854-lplm8"] Feb 27 00:14:00 crc kubenswrapper[4781]: E0227 00:14:00.134001 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16339491-baee-42b5-82bb-07bca82a5f77" containerName="registry" Feb 27 00:14:00 crc kubenswrapper[4781]: I0227 00:14:00.134017 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="16339491-baee-42b5-82bb-07bca82a5f77" containerName="registry" Feb 27 00:14:00 crc kubenswrapper[4781]: I0227 00:14:00.134146 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="16339491-baee-42b5-82bb-07bca82a5f77" containerName="registry" Feb 27 00:14:00 crc kubenswrapper[4781]: I0227 00:14:00.134970 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535854-lplm8" Feb 27 00:14:00 crc kubenswrapper[4781]: I0227 00:14:00.136770 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:14:00 crc kubenswrapper[4781]: I0227 00:14:00.136806 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:14:00 crc kubenswrapper[4781]: I0227 00:14:00.136916 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:14:00 crc kubenswrapper[4781]: I0227 00:14:00.138143 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535854-lplm8"] Feb 27 00:14:00 crc kubenswrapper[4781]: I0227 00:14:00.327724 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tthd\" (UniqueName: \"kubernetes.io/projected/d2676f22-56e0-46ed-83d0-4d29fc704155-kube-api-access-5tthd\") pod \"auto-csr-approver-29535854-lplm8\" (UID: \"d2676f22-56e0-46ed-83d0-4d29fc704155\") " pod="openshift-infra/auto-csr-approver-29535854-lplm8" Feb 27 00:14:00 crc kubenswrapper[4781]: I0227 00:14:00.428288 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tthd\" (UniqueName: \"kubernetes.io/projected/d2676f22-56e0-46ed-83d0-4d29fc704155-kube-api-access-5tthd\") pod \"auto-csr-approver-29535854-lplm8\" (UID: \"d2676f22-56e0-46ed-83d0-4d29fc704155\") " pod="openshift-infra/auto-csr-approver-29535854-lplm8" Feb 27 00:14:00 crc kubenswrapper[4781]: I0227 00:14:00.451930 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tthd\" (UniqueName: \"kubernetes.io/projected/d2676f22-56e0-46ed-83d0-4d29fc704155-kube-api-access-5tthd\") pod \"auto-csr-approver-29535854-lplm8\" (UID: \"d2676f22-56e0-46ed-83d0-4d29fc704155\") " 
pod="openshift-infra/auto-csr-approver-29535854-lplm8" Feb 27 00:14:01 crc kubenswrapper[4781]: I0227 00:14:01.066124 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535854-lplm8" Feb 27 00:14:01 crc kubenswrapper[4781]: I0227 00:14:01.263866 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535854-lplm8"] Feb 27 00:14:01 crc kubenswrapper[4781]: I0227 00:14:01.428885 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535854-lplm8" event={"ID":"d2676f22-56e0-46ed-83d0-4d29fc704155","Type":"ContainerStarted","Data":"ede374a6e9d1e538be7d431f4a287918634521cca633711915570e82f4a64bea"} Feb 27 00:14:03 crc kubenswrapper[4781]: I0227 00:14:03.440773 4781 generic.go:334] "Generic (PLEG): container finished" podID="d2676f22-56e0-46ed-83d0-4d29fc704155" containerID="7ea50ff483bc5e473c8ac4484b625c2d3aca274594f654dad11472e0c517581a" exitCode=0 Feb 27 00:14:03 crc kubenswrapper[4781]: I0227 00:14:03.440835 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535854-lplm8" event={"ID":"d2676f22-56e0-46ed-83d0-4d29fc704155","Type":"ContainerDied","Data":"7ea50ff483bc5e473c8ac4484b625c2d3aca274594f654dad11472e0c517581a"} Feb 27 00:14:04 crc kubenswrapper[4781]: I0227 00:14:04.666881 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535854-lplm8" Feb 27 00:14:04 crc kubenswrapper[4781]: I0227 00:14:04.778425 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tthd\" (UniqueName: \"kubernetes.io/projected/d2676f22-56e0-46ed-83d0-4d29fc704155-kube-api-access-5tthd\") pod \"d2676f22-56e0-46ed-83d0-4d29fc704155\" (UID: \"d2676f22-56e0-46ed-83d0-4d29fc704155\") " Feb 27 00:14:04 crc kubenswrapper[4781]: I0227 00:14:04.785917 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2676f22-56e0-46ed-83d0-4d29fc704155-kube-api-access-5tthd" (OuterVolumeSpecName: "kube-api-access-5tthd") pod "d2676f22-56e0-46ed-83d0-4d29fc704155" (UID: "d2676f22-56e0-46ed-83d0-4d29fc704155"). InnerVolumeSpecName "kube-api-access-5tthd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:14:04 crc kubenswrapper[4781]: I0227 00:14:04.880130 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tthd\" (UniqueName: \"kubernetes.io/projected/d2676f22-56e0-46ed-83d0-4d29fc704155-kube-api-access-5tthd\") on node \"crc\" DevicePath \"\"" Feb 27 00:14:05 crc kubenswrapper[4781]: I0227 00:14:05.452698 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535854-lplm8" event={"ID":"d2676f22-56e0-46ed-83d0-4d29fc704155","Type":"ContainerDied","Data":"ede374a6e9d1e538be7d431f4a287918634521cca633711915570e82f4a64bea"} Feb 27 00:14:05 crc kubenswrapper[4781]: I0227 00:14:05.452742 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ede374a6e9d1e538be7d431f4a287918634521cca633711915570e82f4a64bea" Feb 27 00:14:05 crc kubenswrapper[4781]: I0227 00:14:05.452775 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535854-lplm8" Feb 27 00:14:05 crc kubenswrapper[4781]: I0227 00:14:05.727228 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535848-ccctv"] Feb 27 00:14:05 crc kubenswrapper[4781]: I0227 00:14:05.735670 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535848-ccctv"] Feb 27 00:14:07 crc kubenswrapper[4781]: I0227 00:14:07.316961 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df035290-8e3c-422b-90ac-573b592defcf" path="/var/lib/kubelet/pods/df035290-8e3c-422b-90ac-573b592defcf/volumes" Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.161338 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h"] Feb 27 00:15:00 crc kubenswrapper[4781]: E0227 00:15:00.162239 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2676f22-56e0-46ed-83d0-4d29fc704155" containerName="oc" Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.162255 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2676f22-56e0-46ed-83d0-4d29fc704155" containerName="oc" Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.162378 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2676f22-56e0-46ed-83d0-4d29fc704155" containerName="oc" Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.162936 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h" Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.165679 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.165771 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.176284 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h"] Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.295720 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22b44418-6039-4859-96ba-1442e52b290e-secret-volume\") pod \"collect-profiles-29535855-hf24h\" (UID: \"22b44418-6039-4859-96ba-1442e52b290e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h" Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.295938 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htw5p\" (UniqueName: \"kubernetes.io/projected/22b44418-6039-4859-96ba-1442e52b290e-kube-api-access-htw5p\") pod \"collect-profiles-29535855-hf24h\" (UID: \"22b44418-6039-4859-96ba-1442e52b290e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h" Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.296034 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22b44418-6039-4859-96ba-1442e52b290e-config-volume\") pod \"collect-profiles-29535855-hf24h\" (UID: \"22b44418-6039-4859-96ba-1442e52b290e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h" Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.398034 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htw5p\" (UniqueName: \"kubernetes.io/projected/22b44418-6039-4859-96ba-1442e52b290e-kube-api-access-htw5p\") pod \"collect-profiles-29535855-hf24h\" (UID: \"22b44418-6039-4859-96ba-1442e52b290e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h" Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.398155 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22b44418-6039-4859-96ba-1442e52b290e-config-volume\") pod \"collect-profiles-29535855-hf24h\" (UID: \"22b44418-6039-4859-96ba-1442e52b290e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h" Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.398224 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22b44418-6039-4859-96ba-1442e52b290e-secret-volume\") pod \"collect-profiles-29535855-hf24h\" (UID: \"22b44418-6039-4859-96ba-1442e52b290e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h" Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.399664 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22b44418-6039-4859-96ba-1442e52b290e-config-volume\") pod \"collect-profiles-29535855-hf24h\" (UID: \"22b44418-6039-4859-96ba-1442e52b290e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h" Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.411075 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/22b44418-6039-4859-96ba-1442e52b290e-secret-volume\") pod \"collect-profiles-29535855-hf24h\" (UID: \"22b44418-6039-4859-96ba-1442e52b290e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h" Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.421126 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htw5p\" (UniqueName: \"kubernetes.io/projected/22b44418-6039-4859-96ba-1442e52b290e-kube-api-access-htw5p\") pod \"collect-profiles-29535855-hf24h\" (UID: \"22b44418-6039-4859-96ba-1442e52b290e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h" Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.490123 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h" Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.737283 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h"] Feb 27 00:15:00 crc kubenswrapper[4781]: I0227 00:15:00.864399 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h" event={"ID":"22b44418-6039-4859-96ba-1442e52b290e","Type":"ContainerStarted","Data":"2e0f59f7e2c9a2e9d7f4c88bbf3022043ccf3ca7386d743bd5e15d7fdd6bdd78"} Feb 27 00:15:01 crc kubenswrapper[4781]: I0227 00:15:01.875094 4781 generic.go:334] "Generic (PLEG): container finished" podID="22b44418-6039-4859-96ba-1442e52b290e" containerID="1f1ef56dac2e7ed3023bb30987d569aec06c9a96b99c1e9e939085397f33ecaf" exitCode=0 Feb 27 00:15:01 crc kubenswrapper[4781]: I0227 00:15:01.875140 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h" 
event={"ID":"22b44418-6039-4859-96ba-1442e52b290e","Type":"ContainerDied","Data":"1f1ef56dac2e7ed3023bb30987d569aec06c9a96b99c1e9e939085397f33ecaf"}
Feb 27 00:15:03 crc kubenswrapper[4781]: I0227 00:15:03.117839 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h"
Feb 27 00:15:03 crc kubenswrapper[4781]: I0227 00:15:03.236048 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htw5p\" (UniqueName: \"kubernetes.io/projected/22b44418-6039-4859-96ba-1442e52b290e-kube-api-access-htw5p\") pod \"22b44418-6039-4859-96ba-1442e52b290e\" (UID: \"22b44418-6039-4859-96ba-1442e52b290e\") "
Feb 27 00:15:03 crc kubenswrapper[4781]: I0227 00:15:03.236233 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22b44418-6039-4859-96ba-1442e52b290e-config-volume\") pod \"22b44418-6039-4859-96ba-1442e52b290e\" (UID: \"22b44418-6039-4859-96ba-1442e52b290e\") "
Feb 27 00:15:03 crc kubenswrapper[4781]: I0227 00:15:03.236362 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22b44418-6039-4859-96ba-1442e52b290e-secret-volume\") pod \"22b44418-6039-4859-96ba-1442e52b290e\" (UID: \"22b44418-6039-4859-96ba-1442e52b290e\") "
Feb 27 00:15:03 crc kubenswrapper[4781]: I0227 00:15:03.237658 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22b44418-6039-4859-96ba-1442e52b290e-config-volume" (OuterVolumeSpecName: "config-volume") pod "22b44418-6039-4859-96ba-1442e52b290e" (UID: "22b44418-6039-4859-96ba-1442e52b290e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:15:03 crc kubenswrapper[4781]: I0227 00:15:03.242522 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22b44418-6039-4859-96ba-1442e52b290e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "22b44418-6039-4859-96ba-1442e52b290e" (UID: "22b44418-6039-4859-96ba-1442e52b290e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:15:03 crc kubenswrapper[4781]: I0227 00:15:03.242859 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22b44418-6039-4859-96ba-1442e52b290e-kube-api-access-htw5p" (OuterVolumeSpecName: "kube-api-access-htw5p") pod "22b44418-6039-4859-96ba-1442e52b290e" (UID: "22b44418-6039-4859-96ba-1442e52b290e"). InnerVolumeSpecName "kube-api-access-htw5p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:15:03 crc kubenswrapper[4781]: I0227 00:15:03.338212 4781 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22b44418-6039-4859-96ba-1442e52b290e-config-volume\") on node \"crc\" DevicePath \"\""
Feb 27 00:15:03 crc kubenswrapper[4781]: I0227 00:15:03.338240 4781 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22b44418-6039-4859-96ba-1442e52b290e-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 27 00:15:03 crc kubenswrapper[4781]: I0227 00:15:03.338250 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htw5p\" (UniqueName: \"kubernetes.io/projected/22b44418-6039-4859-96ba-1442e52b290e-kube-api-access-htw5p\") on node \"crc\" DevicePath \"\""
Feb 27 00:15:03 crc kubenswrapper[4781]: I0227 00:15:03.890388 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h" event={"ID":"22b44418-6039-4859-96ba-1442e52b290e","Type":"ContainerDied","Data":"2e0f59f7e2c9a2e9d7f4c88bbf3022043ccf3ca7386d743bd5e15d7fdd6bdd78"}
Feb 27 00:15:03 crc kubenswrapper[4781]: I0227 00:15:03.890447 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e0f59f7e2c9a2e9d7f4c88bbf3022043ccf3ca7386d743bd5e15d7fdd6bdd78"
Feb 27 00:15:03 crc kubenswrapper[4781]: I0227 00:15:03.890466 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h"
Feb 27 00:15:48 crc kubenswrapper[4781]: I0227 00:15:48.625207 4781 scope.go:117] "RemoveContainer" containerID="0c5e0439f18997d1945f8c92f69edded31054471dc31175a4e23307895e84fc9"
Feb 27 00:16:00 crc kubenswrapper[4781]: I0227 00:16:00.149242 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535856-mznwl"]
Feb 27 00:16:00 crc kubenswrapper[4781]: E0227 00:16:00.150174 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22b44418-6039-4859-96ba-1442e52b290e" containerName="collect-profiles"
Feb 27 00:16:00 crc kubenswrapper[4781]: I0227 00:16:00.150191 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="22b44418-6039-4859-96ba-1442e52b290e" containerName="collect-profiles"
Feb 27 00:16:00 crc kubenswrapper[4781]: I0227 00:16:00.150316 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="22b44418-6039-4859-96ba-1442e52b290e" containerName="collect-profiles"
Feb 27 00:16:00 crc kubenswrapper[4781]: I0227 00:16:00.150756 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535856-mznwl"
Feb 27 00:16:00 crc kubenswrapper[4781]: I0227 00:16:00.153205 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 27 00:16:00 crc kubenswrapper[4781]: I0227 00:16:00.153729 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 27 00:16:00 crc kubenswrapper[4781]: I0227 00:16:00.153870 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr"
Feb 27 00:16:00 crc kubenswrapper[4781]: I0227 00:16:00.159088 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535856-mznwl"]
Feb 27 00:16:00 crc kubenswrapper[4781]: I0227 00:16:00.229731 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j29wm\" (UniqueName: \"kubernetes.io/projected/778d83b2-2e0c-45b3-a296-aaba355c6427-kube-api-access-j29wm\") pod \"auto-csr-approver-29535856-mznwl\" (UID: \"778d83b2-2e0c-45b3-a296-aaba355c6427\") " pod="openshift-infra/auto-csr-approver-29535856-mznwl"
Feb 27 00:16:00 crc kubenswrapper[4781]: I0227 00:16:00.330355 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j29wm\" (UniqueName: \"kubernetes.io/projected/778d83b2-2e0c-45b3-a296-aaba355c6427-kube-api-access-j29wm\") pod \"auto-csr-approver-29535856-mznwl\" (UID: \"778d83b2-2e0c-45b3-a296-aaba355c6427\") " pod="openshift-infra/auto-csr-approver-29535856-mznwl"
Feb 27 00:16:00 crc kubenswrapper[4781]: I0227 00:16:00.364003 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j29wm\" (UniqueName: \"kubernetes.io/projected/778d83b2-2e0c-45b3-a296-aaba355c6427-kube-api-access-j29wm\") pod \"auto-csr-approver-29535856-mznwl\" (UID: \"778d83b2-2e0c-45b3-a296-aaba355c6427\") " pod="openshift-infra/auto-csr-approver-29535856-mznwl"
Feb 27 00:16:00 crc kubenswrapper[4781]: I0227 00:16:00.479272 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535856-mznwl"
Feb 27 00:16:00 crc kubenswrapper[4781]: I0227 00:16:00.739868 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535856-mznwl"]
Feb 27 00:16:00 crc kubenswrapper[4781]: W0227 00:16:00.747831 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod778d83b2_2e0c_45b3_a296_aaba355c6427.slice/crio-9f7a921969dd474fa7e006e106226a59df03f75c725b3c79c4a135ad3cc753fc WatchSource:0}: Error finding container 9f7a921969dd474fa7e006e106226a59df03f75c725b3c79c4a135ad3cc753fc: Status 404 returned error can't find the container with id 9f7a921969dd474fa7e006e106226a59df03f75c725b3c79c4a135ad3cc753fc
Feb 27 00:16:00 crc kubenswrapper[4781]: I0227 00:16:00.750216 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 27 00:16:01 crc kubenswrapper[4781]: I0227 00:16:01.292441 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535856-mznwl" event={"ID":"778d83b2-2e0c-45b3-a296-aaba355c6427","Type":"ContainerStarted","Data":"9f7a921969dd474fa7e006e106226a59df03f75c725b3c79c4a135ad3cc753fc"}
Feb 27 00:16:02 crc kubenswrapper[4781]: I0227 00:16:02.299591 4781 generic.go:334] "Generic (PLEG): container finished" podID="778d83b2-2e0c-45b3-a296-aaba355c6427" containerID="96bd641ff5c28b0d487d9f55a81f55a83bc758e496b0e0a0d2639cc8d0b260d5" exitCode=0
Feb 27 00:16:02 crc kubenswrapper[4781]: I0227 00:16:02.299689 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535856-mznwl" event={"ID":"778d83b2-2e0c-45b3-a296-aaba355c6427","Type":"ContainerDied","Data":"96bd641ff5c28b0d487d9f55a81f55a83bc758e496b0e0a0d2639cc8d0b260d5"}
Feb 27 00:16:03 crc kubenswrapper[4781]: I0227 00:16:03.559024 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535856-mznwl"
Feb 27 00:16:03 crc kubenswrapper[4781]: I0227 00:16:03.677440 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j29wm\" (UniqueName: \"kubernetes.io/projected/778d83b2-2e0c-45b3-a296-aaba355c6427-kube-api-access-j29wm\") pod \"778d83b2-2e0c-45b3-a296-aaba355c6427\" (UID: \"778d83b2-2e0c-45b3-a296-aaba355c6427\") "
Feb 27 00:16:03 crc kubenswrapper[4781]: I0227 00:16:03.684259 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/778d83b2-2e0c-45b3-a296-aaba355c6427-kube-api-access-j29wm" (OuterVolumeSpecName: "kube-api-access-j29wm") pod "778d83b2-2e0c-45b3-a296-aaba355c6427" (UID: "778d83b2-2e0c-45b3-a296-aaba355c6427"). InnerVolumeSpecName "kube-api-access-j29wm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:16:03 crc kubenswrapper[4781]: I0227 00:16:03.779886 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j29wm\" (UniqueName: \"kubernetes.io/projected/778d83b2-2e0c-45b3-a296-aaba355c6427-kube-api-access-j29wm\") on node \"crc\" DevicePath \"\""
Feb 27 00:16:04 crc kubenswrapper[4781]: I0227 00:16:04.315071 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535856-mznwl" event={"ID":"778d83b2-2e0c-45b3-a296-aaba355c6427","Type":"ContainerDied","Data":"9f7a921969dd474fa7e006e106226a59df03f75c725b3c79c4a135ad3cc753fc"}
Feb 27 00:16:04 crc kubenswrapper[4781]: I0227 00:16:04.315125 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f7a921969dd474fa7e006e106226a59df03f75c725b3c79c4a135ad3cc753fc"
Feb 27 00:16:04 crc kubenswrapper[4781]: I0227 00:16:04.315195 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535856-mznwl"
Feb 27 00:16:04 crc kubenswrapper[4781]: I0227 00:16:04.618376 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535850-wzxmm"]
Feb 27 00:16:04 crc kubenswrapper[4781]: I0227 00:16:04.621605 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535850-wzxmm"]
Feb 27 00:16:05 crc kubenswrapper[4781]: I0227 00:16:05.335569 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6acff23f-a17a-4f43-a7d6-32c8ccf4b084" path="/var/lib/kubelet/pods/6acff23f-a17a-4f43-a7d6-32c8ccf4b084/volumes"
Feb 27 00:16:12 crc kubenswrapper[4781]: I0227 00:16:12.896257 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 00:16:12 crc kubenswrapper[4781]: I0227 00:16:12.896826 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 00:16:42 crc kubenswrapper[4781]: I0227 00:16:42.895854 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 00:16:42 crc kubenswrapper[4781]: I0227 00:16:42.896448 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 00:16:48 crc kubenswrapper[4781]: I0227 00:16:48.659889 4781 scope.go:117] "RemoveContainer" containerID="313dbdb071dff64579864e870a0b09038434fbe0ef138af4cad66cd56ba9ca0d"
Feb 27 00:16:48 crc kubenswrapper[4781]: I0227 00:16:48.713356 4781 scope.go:117] "RemoveContainer" containerID="ec7472b1d4abe3539fd2b9c6a74552c975f1e7a845d80d7f3684a0e55a838de1"
Feb 27 00:16:48 crc kubenswrapper[4781]: I0227 00:16:48.732380 4781 scope.go:117] "RemoveContainer" containerID="7a5bc22436045a92f14d9e48387b73688e7285010edca28bce2bf80e2706ff98"
Feb 27 00:16:48 crc kubenswrapper[4781]: I0227 00:16:48.757163 4781 scope.go:117] "RemoveContainer" containerID="a316b4241144a66af579b620906b51669485f94b0371b42e5c56ba88e48d2942"
Feb 27 00:17:12 crc kubenswrapper[4781]: I0227 00:17:12.895056 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 00:17:12 crc kubenswrapper[4781]: I0227 00:17:12.895660 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 00:17:12 crc kubenswrapper[4781]: I0227 00:17:12.895712 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj"
Feb 27 00:17:12 crc kubenswrapper[4781]: I0227 00:17:12.896276 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"98d9908780c17a21a4a701f7c994bde3e3fbb6ea911f1b4e11c3a27ce7db4d1d"} pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 27 00:17:12 crc kubenswrapper[4781]: I0227 00:17:12.896327 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" containerID="cri-o://98d9908780c17a21a4a701f7c994bde3e3fbb6ea911f1b4e11c3a27ce7db4d1d" gracePeriod=600
Feb 27 00:17:13 crc kubenswrapper[4781]: I0227 00:17:13.320427 4781 generic.go:334] "Generic (PLEG): container finished" podID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerID="98d9908780c17a21a4a701f7c994bde3e3fbb6ea911f1b4e11c3a27ce7db4d1d" exitCode=0
Feb 27 00:17:13 crc kubenswrapper[4781]: I0227 00:17:13.328294 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerDied","Data":"98d9908780c17a21a4a701f7c994bde3e3fbb6ea911f1b4e11c3a27ce7db4d1d"}
Feb 27 00:17:13 crc kubenswrapper[4781]: I0227 00:17:13.328701 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"4a4838ae34a31bed19fe04c8cb77eb7ca161a34e4d168445bf5a5f93e91a959a"}
Feb 27 00:17:13 crc kubenswrapper[4781]: I0227 00:17:13.328984 4781 scope.go:117] "RemoveContainer" containerID="0a9584e9887d3110a6a6d2ad5c5024fb38c734637c177fd2cbddb2eae4932cdc"
Feb 27 00:17:38 crc kubenswrapper[4781]: I0227 00:17:38.565614 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9"]
Feb 27 00:17:38 crc kubenswrapper[4781]: E0227 00:17:38.566432 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="778d83b2-2e0c-45b3-a296-aaba355c6427" containerName="oc"
Feb 27 00:17:38 crc kubenswrapper[4781]: I0227 00:17:38.566448 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="778d83b2-2e0c-45b3-a296-aaba355c6427" containerName="oc"
Feb 27 00:17:38 crc kubenswrapper[4781]: I0227 00:17:38.566570 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="778d83b2-2e0c-45b3-a296-aaba355c6427" containerName="oc"
Feb 27 00:17:38 crc kubenswrapper[4781]: I0227 00:17:38.567484 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9"
Feb 27 00:17:38 crc kubenswrapper[4781]: I0227 00:17:38.569881 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Feb 27 00:17:38 crc kubenswrapper[4781]: I0227 00:17:38.578209 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9"]
Feb 27 00:17:38 crc kubenswrapper[4781]: I0227 00:17:38.675641 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffqqr\" (UniqueName: \"kubernetes.io/projected/d6e87b6c-eb25-4485-b639-6181c0ad86c7-kube-api-access-ffqqr\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9\" (UID: \"d6e87b6c-eb25-4485-b639-6181c0ad86c7\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9"
Feb 27 00:17:38 crc kubenswrapper[4781]: I0227 00:17:38.675736 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d6e87b6c-eb25-4485-b639-6181c0ad86c7-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9\" (UID: \"d6e87b6c-eb25-4485-b639-6181c0ad86c7\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9"
Feb 27 00:17:38 crc kubenswrapper[4781]: I0227 00:17:38.675832 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d6e87b6c-eb25-4485-b639-6181c0ad86c7-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9\" (UID: \"d6e87b6c-eb25-4485-b639-6181c0ad86c7\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9"
Feb 27 00:17:38 crc kubenswrapper[4781]: I0227 00:17:38.777567 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d6e87b6c-eb25-4485-b639-6181c0ad86c7-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9\" (UID: \"d6e87b6c-eb25-4485-b639-6181c0ad86c7\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9"
Feb 27 00:17:38 crc kubenswrapper[4781]: I0227 00:17:38.777700 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d6e87b6c-eb25-4485-b639-6181c0ad86c7-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9\" (UID: \"d6e87b6c-eb25-4485-b639-6181c0ad86c7\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9"
Feb 27 00:17:38 crc kubenswrapper[4781]: I0227 00:17:38.777802 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffqqr\" (UniqueName: \"kubernetes.io/projected/d6e87b6c-eb25-4485-b639-6181c0ad86c7-kube-api-access-ffqqr\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9\" (UID: \"d6e87b6c-eb25-4485-b639-6181c0ad86c7\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9"
Feb 27 00:17:38 crc kubenswrapper[4781]: I0227 00:17:38.779015 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d6e87b6c-eb25-4485-b639-6181c0ad86c7-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9\" (UID: \"d6e87b6c-eb25-4485-b639-6181c0ad86c7\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9"
Feb 27 00:17:38 crc kubenswrapper[4781]: I0227 00:17:38.779409 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d6e87b6c-eb25-4485-b639-6181c0ad86c7-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9\" (UID: \"d6e87b6c-eb25-4485-b639-6181c0ad86c7\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9"
Feb 27 00:17:38 crc kubenswrapper[4781]: I0227 00:17:38.799240 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffqqr\" (UniqueName: \"kubernetes.io/projected/d6e87b6c-eb25-4485-b639-6181c0ad86c7-kube-api-access-ffqqr\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9\" (UID: \"d6e87b6c-eb25-4485-b639-6181c0ad86c7\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9"
Feb 27 00:17:38 crc kubenswrapper[4781]: I0227 00:17:38.896391 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9"
Feb 27 00:17:39 crc kubenswrapper[4781]: I0227 00:17:39.108249 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9"]
Feb 27 00:17:39 crc kubenswrapper[4781]: I0227 00:17:39.503980 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9" event={"ID":"d6e87b6c-eb25-4485-b639-6181c0ad86c7","Type":"ContainerStarted","Data":"f54b7e9e179332b1994e5d183e219982590ec13795f05a038610a4dda166e81a"}
Feb 27 00:17:39 crc kubenswrapper[4781]: I0227 00:17:39.504020 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9" event={"ID":"d6e87b6c-eb25-4485-b639-6181c0ad86c7","Type":"ContainerStarted","Data":"583dd8ad0d27e298ba7949507fe2a673ad3cc2c41e2f67d4ecf6a4498ef534cf"}
Feb 27 00:17:40 crc kubenswrapper[4781]: I0227 00:17:40.520534 4781 generic.go:334] "Generic (PLEG): container finished" podID="d6e87b6c-eb25-4485-b639-6181c0ad86c7" containerID="f54b7e9e179332b1994e5d183e219982590ec13795f05a038610a4dda166e81a" exitCode=0
Feb 27 00:17:40 crc kubenswrapper[4781]: I0227 00:17:40.520729 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9" event={"ID":"d6e87b6c-eb25-4485-b639-6181c0ad86c7","Type":"ContainerDied","Data":"f54b7e9e179332b1994e5d183e219982590ec13795f05a038610a4dda166e81a"}
Feb 27 00:17:41 crc kubenswrapper[4781]: I0227 00:17:41.527434 4781 generic.go:334] "Generic (PLEG): container finished" podID="d6e87b6c-eb25-4485-b639-6181c0ad86c7" containerID="0e4334abb705666c2ccafacdc16e3c52ccf2d9fad5d1cd17b493c56925fc3ffc" exitCode=0
Feb 27 00:17:41 crc kubenswrapper[4781]: I0227 00:17:41.527582 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9" event={"ID":"d6e87b6c-eb25-4485-b639-6181c0ad86c7","Type":"ContainerDied","Data":"0e4334abb705666c2ccafacdc16e3c52ccf2d9fad5d1cd17b493c56925fc3ffc"}
Feb 27 00:17:42 crc kubenswrapper[4781]: I0227 00:17:42.543955 4781 generic.go:334] "Generic (PLEG): container finished" podID="d6e87b6c-eb25-4485-b639-6181c0ad86c7" containerID="63aec967b7a56bbc27060a22108d12825ab75ccaabd6f9eda49c69490997e3e4" exitCode=0
Feb 27 00:17:42 crc kubenswrapper[4781]: I0227 00:17:42.543994 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9" event={"ID":"d6e87b6c-eb25-4485-b639-6181c0ad86c7","Type":"ContainerDied","Data":"63aec967b7a56bbc27060a22108d12825ab75ccaabd6f9eda49c69490997e3e4"}
Feb 27 00:17:43 crc kubenswrapper[4781]: I0227 00:17:43.814745 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9"
Feb 27 00:17:43 crc kubenswrapper[4781]: I0227 00:17:43.946441 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d6e87b6c-eb25-4485-b639-6181c0ad86c7-util\") pod \"d6e87b6c-eb25-4485-b639-6181c0ad86c7\" (UID: \"d6e87b6c-eb25-4485-b639-6181c0ad86c7\") "
Feb 27 00:17:43 crc kubenswrapper[4781]: I0227 00:17:43.946589 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d6e87b6c-eb25-4485-b639-6181c0ad86c7-bundle\") pod \"d6e87b6c-eb25-4485-b639-6181c0ad86c7\" (UID: \"d6e87b6c-eb25-4485-b639-6181c0ad86c7\") "
Feb 27 00:17:43 crc kubenswrapper[4781]: I0227 00:17:43.949249 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6e87b6c-eb25-4485-b639-6181c0ad86c7-bundle" (OuterVolumeSpecName: "bundle") pod "d6e87b6c-eb25-4485-b639-6181c0ad86c7" (UID: "d6e87b6c-eb25-4485-b639-6181c0ad86c7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:17:43 crc kubenswrapper[4781]: I0227 00:17:43.949309 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffqqr\" (UniqueName: \"kubernetes.io/projected/d6e87b6c-eb25-4485-b639-6181c0ad86c7-kube-api-access-ffqqr\") pod \"d6e87b6c-eb25-4485-b639-6181c0ad86c7\" (UID: \"d6e87b6c-eb25-4485-b639-6181c0ad86c7\") "
Feb 27 00:17:43 crc kubenswrapper[4781]: I0227 00:17:43.949650 4781 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d6e87b6c-eb25-4485-b639-6181c0ad86c7-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 00:17:43 crc kubenswrapper[4781]: I0227 00:17:43.952034 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6e87b6c-eb25-4485-b639-6181c0ad86c7-kube-api-access-ffqqr" (OuterVolumeSpecName: "kube-api-access-ffqqr") pod "d6e87b6c-eb25-4485-b639-6181c0ad86c7" (UID: "d6e87b6c-eb25-4485-b639-6181c0ad86c7"). InnerVolumeSpecName "kube-api-access-ffqqr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:17:43 crc kubenswrapper[4781]: I0227 00:17:43.975941 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6e87b6c-eb25-4485-b639-6181c0ad86c7-util" (OuterVolumeSpecName: "util") pod "d6e87b6c-eb25-4485-b639-6181c0ad86c7" (UID: "d6e87b6c-eb25-4485-b639-6181c0ad86c7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:17:44 crc kubenswrapper[4781]: I0227 00:17:44.051327 4781 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d6e87b6c-eb25-4485-b639-6181c0ad86c7-util\") on node \"crc\" DevicePath \"\""
Feb 27 00:17:44 crc kubenswrapper[4781]: I0227 00:17:44.051365 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffqqr\" (UniqueName: \"kubernetes.io/projected/d6e87b6c-eb25-4485-b639-6181c0ad86c7-kube-api-access-ffqqr\") on node \"crc\" DevicePath \"\""
Feb 27 00:17:44 crc kubenswrapper[4781]: I0227 00:17:44.561569 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9" event={"ID":"d6e87b6c-eb25-4485-b639-6181c0ad86c7","Type":"ContainerDied","Data":"583dd8ad0d27e298ba7949507fe2a673ad3cc2c41e2f67d4ecf6a4498ef534cf"}
Feb 27 00:17:44 crc kubenswrapper[4781]: I0227 00:17:44.561659 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9"
Feb 27 00:17:44 crc kubenswrapper[4781]: I0227 00:17:44.561681 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="583dd8ad0d27e298ba7949507fe2a673ad3cc2c41e2f67d4ecf6a4498ef534cf"
Feb 27 00:17:49 crc kubenswrapper[4781]: I0227 00:17:49.891696 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-d2zn6"]
Feb 27 00:17:49 crc kubenswrapper[4781]: I0227 00:17:49.892529 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovn-controller" containerID="cri-o://4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d" gracePeriod=30
Feb 27 00:17:49 crc kubenswrapper[4781]: I0227 00:17:49.892587 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="nbdb" containerID="cri-o://87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a" gracePeriod=30
Feb 27 00:17:49 crc kubenswrapper[4781]: I0227 00:17:49.892657 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="northd" containerID="cri-o://abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a" gracePeriod=30
Feb 27 00:17:49 crc kubenswrapper[4781]: I0227 00:17:49.892696 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9" gracePeriod=30
Feb 27 00:17:49 crc kubenswrapper[4781]: I0227 00:17:49.892709 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="sbdb" containerID="cri-o://f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381" gracePeriod=30
Feb 27 00:17:49 crc kubenswrapper[4781]: I0227 00:17:49.892727 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="kube-rbac-proxy-node" containerID="cri-o://49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87" gracePeriod=30
Feb 27 00:17:49 crc kubenswrapper[4781]: I0227 00:17:49.892757 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovn-acl-logging" containerID="cri-o://7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403" gracePeriod=30
Feb 27 00:17:49 crc kubenswrapper[4781]: I0227 00:17:49.938954 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovnkube-controller" containerID="cri-o://ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c" gracePeriod=30
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.185945 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovnkube-controller/4.log"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.186515 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovnkube-controller/3.log"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.188416 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovn-acl-logging/0.log"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.188988 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovn-controller/0.log"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.189426 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.244753 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jfhx4"]
Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.244964 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="kubecfg-setup"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.244979 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="kubecfg-setup"
Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.244988 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovnkube-controller"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.244994 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovnkube-controller"
Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.245001 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovnkube-controller"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245009 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovnkube-controller"
Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.245016 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e87b6c-eb25-4485-b639-6181c0ad86c7" containerName="util"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245022 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e87b6c-eb25-4485-b639-6181c0ad86c7" containerName="util"
Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.245029 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovnkube-controller"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245034 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovnkube-controller"
Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.245044 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="kube-rbac-proxy-ovn-metrics"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245050 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="kube-rbac-proxy-ovn-metrics"
Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.245059 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="northd"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245065 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="northd"
Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.245074 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovnkube-controller"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245082 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovnkube-controller"
Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.245089 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e87b6c-eb25-4485-b639-6181c0ad86c7" containerName="extract"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245095 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e87b6c-eb25-4485-b639-6181c0ad86c7" containerName="extract"
Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.245105 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovn-controller"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245113 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovn-controller"
Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.245126 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="sbdb"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245134 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="sbdb"
Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.245147 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="kube-rbac-proxy-node"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245155 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="kube-rbac-proxy-node"
Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.245163 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovn-acl-logging"
Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245170 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovn-acl-logging"
Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.245178 4781 cpu_manager.go:410] "RemoveStaleState:
removing container" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="nbdb" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245185 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="nbdb" Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.245192 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e87b6c-eb25-4485-b639-6181c0ad86c7" containerName="pull" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245198 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e87b6c-eb25-4485-b639-6181c0ad86c7" containerName="pull" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245372 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovn-acl-logging" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245385 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovnkube-controller" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245392 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="nbdb" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245400 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6e87b6c-eb25-4485-b639-6181c0ad86c7" containerName="extract" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245408 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="northd" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245418 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovnkube-controller" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245426 4781 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="kube-rbac-proxy-ovn-metrics" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245434 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovnkube-controller" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245442 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="sbdb" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245451 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovn-controller" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245463 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="kube-rbac-proxy-node" Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.245580 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovnkube-controller" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245588 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovnkube-controller" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245721 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovnkube-controller" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.245902 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerName="ovnkube-controller" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.247119 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.325597 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-etc-openvswitch\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.325676 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-ovn-node-metrics-cert\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.325700 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-var-lib-openvswitch\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.325715 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-systemd-units\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.325737 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-run-systemd\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.325734 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.325751 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-run-netns\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.325797 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.325829 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.325826 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.325866 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.325839 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-var-lib-cni-networks-ovn-kubernetes\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.325913 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-run-ovn-kubernetes\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.325936 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-run-openvswitch\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.325963 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-ovnkube-script-lib\") pod 
\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.325968 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.325993 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326012 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-kubelet\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326041 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-run-ovn\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326072 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-log-socket\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: 
\"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326083 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326090 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-slash\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326125 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-node-log\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326159 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-env-overrides\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326187 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5qlg\" (UniqueName: \"kubernetes.io/projected/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-kube-api-access-r5qlg\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326212 4781 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-ovnkube-config\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326291 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-cni-netd\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326315 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-cni-bin\") pod \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\" (UID: \"12a87c22-b4e1-4aa9-8b3e-a34f7d159239\") " Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326459 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326526 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-log-socket\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326804 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-slash\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326926 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-var-lib-openvswitch\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327006 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-run-ovn\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326573 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). 
InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326574 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-log-socket" (OuterVolumeSpecName: "log-socket") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326609 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-slash" (OuterVolumeSpecName: "host-slash") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326676 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326685 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-node-log" (OuterVolumeSpecName: "node-log") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326701 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326720 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.326925 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327123 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-kubelet\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327180 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-systemd-units\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327227 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327244 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/78f87967-e9e0-4e6a-ab3b-2216e4272c02-ovnkube-config\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327271 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/78f87967-e9e0-4e6a-ab3b-2216e4272c02-env-overrides\") pod 
\"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327290 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-node-log\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327303 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-etc-openvswitch\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327326 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-run-openvswitch\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327349 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-cni-bin\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327375 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-cni-netd\") 
pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327399 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/78f87967-e9e0-4e6a-ab3b-2216e4272c02-ovnkube-script-lib\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327451 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-run-netns\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327467 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-run-ovn-kubernetes\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327482 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-run-systemd\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327509 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58jmz\" (UniqueName: 
\"kubernetes.io/projected/78f87967-e9e0-4e6a-ab3b-2216e4272c02-kube-api-access-58jmz\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327538 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/78f87967-e9e0-4e6a-ab3b-2216e4272c02-ovn-node-metrics-cert\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327578 4781 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327589 4781 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-log-socket\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327599 4781 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-slash\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327609 4781 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-node-log\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327617 4781 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327638 4781 
reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327647 4781 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327655 4781 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327664 4781 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327673 4781 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327681 4781 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327689 4781 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327697 4781 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327707 4781 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327715 4781 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327724 4781 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.327732 4781 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.337887 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-kube-api-access-r5qlg" (OuterVolumeSpecName: "kube-api-access-r5qlg") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "kube-api-access-r5qlg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.338272 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.338301 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "12a87c22-b4e1-4aa9-8b3e-a34f7d159239" (UID: "12a87c22-b4e1-4aa9-8b3e-a34f7d159239"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428294 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-cni-bin\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428550 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-cni-netd\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428574 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/78f87967-e9e0-4e6a-ab3b-2216e4272c02-ovnkube-script-lib\") pod \"ovnkube-node-jfhx4\" 
(UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428597 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-run-netns\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428610 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-run-ovn-kubernetes\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428644 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-run-systemd\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428667 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58jmz\" (UniqueName: \"kubernetes.io/projected/78f87967-e9e0-4e6a-ab3b-2216e4272c02-kube-api-access-58jmz\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428687 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/78f87967-e9e0-4e6a-ab3b-2216e4272c02-ovn-node-metrics-cert\") pod \"ovnkube-node-jfhx4\" (UID: 
\"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428721 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-log-socket\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428737 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-slash\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428753 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-var-lib-openvswitch\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428771 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-run-ovn\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428787 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-kubelet\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 
00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428802 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-systemd-units\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428822 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428836 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/78f87967-e9e0-4e6a-ab3b-2216e4272c02-ovnkube-config\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428874 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/78f87967-e9e0-4e6a-ab3b-2216e4272c02-env-overrides\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428888 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-node-log\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 
00:17:50.428901 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-etc-openvswitch\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428917 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-run-openvswitch\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428950 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5qlg\" (UniqueName: \"kubernetes.io/projected/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-kube-api-access-r5qlg\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428961 4781 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428971 4781 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/12a87c22-b4e1-4aa9-8b3e-a34f7d159239-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.429009 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-run-openvswitch\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.428401 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-cni-bin\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.429049 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-cni-netd\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.429588 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/78f87967-e9e0-4e6a-ab3b-2216e4272c02-ovnkube-script-lib\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.429619 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-run-netns\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.429662 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-run-ovn-kubernetes\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.429682 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-run-systemd\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.430293 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-systemd-units\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.430372 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-log-socket\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.430399 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-slash\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.430423 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-var-lib-openvswitch\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.430451 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-run-ovn\") pod \"ovnkube-node-jfhx4\" (UID: 
\"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.430480 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-kubelet\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.430993 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/78f87967-e9e0-4e6a-ab3b-2216e4272c02-env-overrides\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.431034 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.431457 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/78f87967-e9e0-4e6a-ab3b-2216e4272c02-ovnkube-config\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.431525 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-node-log\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.431557 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/78f87967-e9e0-4e6a-ab3b-2216e4272c02-etc-openvswitch\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.436397 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/78f87967-e9e0-4e6a-ab3b-2216e4272c02-ovn-node-metrics-cert\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.448067 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58jmz\" (UniqueName: \"kubernetes.io/projected/78f87967-e9e0-4e6a-ab3b-2216e4272c02-kube-api-access-58jmz\") pod \"ovnkube-node-jfhx4\" (UID: \"78f87967-e9e0-4e6a-ab3b-2216e4272c02\") " pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.559987 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.597002 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tlstj_9a6dd1e0-45ab-46f0-b298-d89e47aaeecb/kube-multus/2.log" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.601820 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tlstj_9a6dd1e0-45ab-46f0-b298-d89e47aaeecb/kube-multus/1.log" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.601863 4781 generic.go:334] "Generic (PLEG): container finished" podID="9a6dd1e0-45ab-46f0-b298-d89e47aaeecb" containerID="a286864c68415e96f38bba630ac2325989837881e34a926c93977715f330a129" exitCode=2 Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.601937 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tlstj" event={"ID":"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb","Type":"ContainerDied","Data":"a286864c68415e96f38bba630ac2325989837881e34a926c93977715f330a129"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.601983 4781 scope.go:117] "RemoveContainer" containerID="3baa523fad3e36a4728c991057de2da0f51fa6a92e36153c58a2fadc65bd7606" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.602487 4781 scope.go:117] "RemoveContainer" containerID="a286864c68415e96f38bba630ac2325989837881e34a926c93977715f330a129" Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.602692 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-tlstj_openshift-multus(9a6dd1e0-45ab-46f0-b298-d89e47aaeecb)\"" pod="openshift-multus/multus-tlstj" podUID="9a6dd1e0-45ab-46f0-b298-d89e47aaeecb" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.610748 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovnkube-controller/4.log" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.611359 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovnkube-controller/3.log" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.615760 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovn-acl-logging/0.log" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.616224 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d2zn6_12a87c22-b4e1-4aa9-8b3e-a34f7d159239/ovn-controller/0.log" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.616766 4781 generic.go:334] "Generic (PLEG): container finished" podID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerID="ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c" exitCode=2 Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.616794 4781 generic.go:334] "Generic (PLEG): container finished" podID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerID="f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381" exitCode=0 Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.616803 4781 generic.go:334] "Generic (PLEG): container finished" podID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerID="87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a" exitCode=0 Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.616812 4781 generic.go:334] "Generic (PLEG): container finished" podID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerID="abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a" exitCode=0 Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.616823 4781 generic.go:334] "Generic (PLEG): container finished" 
podID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerID="6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9" exitCode=0 Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.616831 4781 generic.go:334] "Generic (PLEG): container finished" podID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerID="49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87" exitCode=0 Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.616838 4781 generic.go:334] "Generic (PLEG): container finished" podID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerID="7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403" exitCode=143 Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.616846 4781 generic.go:334] "Generic (PLEG): container finished" podID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" containerID="4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d" exitCode=143 Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.616867 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerDied","Data":"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.617427 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerDied","Data":"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.617459 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerDied","Data":"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.617897 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerDied","Data":"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.617914 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerDied","Data":"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.617925 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerDied","Data":"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.617938 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.617952 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.617959 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.617965 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.617971 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.617978 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.617984 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.617991 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.617997 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618003 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618011 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerDied","Data":"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618026 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618033 4781 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618039 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618045 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618051 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618056 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618063 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618072 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618078 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618085 4781 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618094 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerDied","Data":"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618105 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618115 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618123 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618129 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618135 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618142 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9"} Feb 27 
00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618148 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618154 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618161 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618167 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618176 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" event={"ID":"12a87c22-b4e1-4aa9-8b3e-a34f7d159239","Type":"ContainerDied","Data":"96af27195e73c8a72996dd4d8221316b5eec9c31c92a51b4fb0d127265c1c59f"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618185 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618193 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618200 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618206 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618212 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618218 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618224 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618230 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618235 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618241 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4"} Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.618349 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d2zn6" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.661819 4781 scope.go:117] "RemoveContainer" containerID="ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.704151 4781 scope.go:117] "RemoveContainer" containerID="ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.705298 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-d2zn6"] Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.709888 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-d2zn6"] Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.727949 4781 scope.go:117] "RemoveContainer" containerID="f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.748274 4781 scope.go:117] "RemoveContainer" containerID="87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.805566 4781 scope.go:117] "RemoveContainer" containerID="abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.820041 4781 scope.go:117] "RemoveContainer" containerID="6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.833718 4781 scope.go:117] "RemoveContainer" containerID="49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.848573 4781 scope.go:117] "RemoveContainer" containerID="7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.860793 4781 scope.go:117] "RemoveContainer" 
containerID="4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.877838 4781 scope.go:117] "RemoveContainer" containerID="34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.896876 4781 scope.go:117] "RemoveContainer" containerID="ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c" Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.897481 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c\": container with ID starting with ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c not found: ID does not exist" containerID="ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.897519 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c"} err="failed to get container status \"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c\": rpc error: code = NotFound desc = could not find container \"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c\": container with ID starting with ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.897557 4781 scope.go:117] "RemoveContainer" containerID="ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867" Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.898858 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867\": container with ID starting with 
ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867 not found: ID does not exist" containerID="ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.898907 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867"} err="failed to get container status \"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867\": rpc error: code = NotFound desc = could not find container \"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867\": container with ID starting with ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.898941 4781 scope.go:117] "RemoveContainer" containerID="f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381" Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.899265 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\": container with ID starting with f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381 not found: ID does not exist" containerID="f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.899300 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381"} err="failed to get container status \"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\": rpc error: code = NotFound desc = could not find container \"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\": container with ID starting with f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381 not found: ID does not 
exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.899324 4781 scope.go:117] "RemoveContainer" containerID="87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a" Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.899606 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\": container with ID starting with 87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a not found: ID does not exist" containerID="87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.899652 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a"} err="failed to get container status \"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\": rpc error: code = NotFound desc = could not find container \"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\": container with ID starting with 87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.899670 4781 scope.go:117] "RemoveContainer" containerID="abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a" Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.899918 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\": container with ID starting with abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a not found: ID does not exist" containerID="abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.899941 4781 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a"} err="failed to get container status \"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\": rpc error: code = NotFound desc = could not find container \"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\": container with ID starting with abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.899955 4781 scope.go:117] "RemoveContainer" containerID="6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9" Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.900179 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\": container with ID starting with 6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9 not found: ID does not exist" containerID="6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.900210 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9"} err="failed to get container status \"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\": rpc error: code = NotFound desc = could not find container \"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\": container with ID starting with 6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.900230 4781 scope.go:117] "RemoveContainer" containerID="49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87" Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.900456 4781 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\": container with ID starting with 49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87 not found: ID does not exist" containerID="49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.900488 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87"} err="failed to get container status \"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\": rpc error: code = NotFound desc = could not find container \"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\": container with ID starting with 49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.900504 4781 scope.go:117] "RemoveContainer" containerID="7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403" Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.900749 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\": container with ID starting with 7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403 not found: ID does not exist" containerID="7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.900789 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403"} err="failed to get container status \"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\": rpc error: code = NotFound desc = could 
not find container \"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\": container with ID starting with 7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.900817 4781 scope.go:117] "RemoveContainer" containerID="4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d" Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.901112 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\": container with ID starting with 4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d not found: ID does not exist" containerID="4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.901142 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d"} err="failed to get container status \"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\": rpc error: code = NotFound desc = could not find container \"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\": container with ID starting with 4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.901161 4781 scope.go:117] "RemoveContainer" containerID="34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4" Feb 27 00:17:50 crc kubenswrapper[4781]: E0227 00:17:50.901727 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\": container with ID starting with 34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4 not found: 
ID does not exist" containerID="34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.901760 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4"} err="failed to get container status \"34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\": rpc error: code = NotFound desc = could not find container \"34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\": container with ID starting with 34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.901782 4781 scope.go:117] "RemoveContainer" containerID="ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.902198 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c"} err="failed to get container status \"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c\": rpc error: code = NotFound desc = could not find container \"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c\": container with ID starting with ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.902220 4781 scope.go:117] "RemoveContainer" containerID="ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.902538 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867"} err="failed to get container status \"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867\": rpc error: code = 
NotFound desc = could not find container \"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867\": container with ID starting with ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.902558 4781 scope.go:117] "RemoveContainer" containerID="f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.902919 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381"} err="failed to get container status \"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\": rpc error: code = NotFound desc = could not find container \"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\": container with ID starting with f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.902936 4781 scope.go:117] "RemoveContainer" containerID="87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.903195 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a"} err="failed to get container status \"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\": rpc error: code = NotFound desc = could not find container \"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\": container with ID starting with 87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.903217 4781 scope.go:117] "RemoveContainer" containerID="abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a" Feb 27 00:17:50 crc 
kubenswrapper[4781]: I0227 00:17:50.903544 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a"} err="failed to get container status \"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\": rpc error: code = NotFound desc = could not find container \"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\": container with ID starting with abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.903569 4781 scope.go:117] "RemoveContainer" containerID="6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.903834 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9"} err="failed to get container status \"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\": rpc error: code = NotFound desc = could not find container \"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\": container with ID starting with 6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.903857 4781 scope.go:117] "RemoveContainer" containerID="49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.904085 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87"} err="failed to get container status \"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\": rpc error: code = NotFound desc = could not find container \"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\": container 
with ID starting with 49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.904104 4781 scope.go:117] "RemoveContainer" containerID="7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.904328 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403"} err="failed to get container status \"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\": rpc error: code = NotFound desc = could not find container \"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\": container with ID starting with 7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.904350 4781 scope.go:117] "RemoveContainer" containerID="4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.904779 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d"} err="failed to get container status \"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\": rpc error: code = NotFound desc = could not find container \"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\": container with ID starting with 4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.904825 4781 scope.go:117] "RemoveContainer" containerID="34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.905303 4781 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4"} err="failed to get container status \"34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\": rpc error: code = NotFound desc = could not find container \"34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\": container with ID starting with 34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.905330 4781 scope.go:117] "RemoveContainer" containerID="ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.905557 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c"} err="failed to get container status \"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c\": rpc error: code = NotFound desc = could not find container \"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c\": container with ID starting with ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.905578 4781 scope.go:117] "RemoveContainer" containerID="ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.905785 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867"} err="failed to get container status \"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867\": rpc error: code = NotFound desc = could not find container \"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867\": container with ID starting with ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867 not found: ID does not 
exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.905803 4781 scope.go:117] "RemoveContainer" containerID="f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.905997 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381"} err="failed to get container status \"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\": rpc error: code = NotFound desc = could not find container \"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\": container with ID starting with f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.906037 4781 scope.go:117] "RemoveContainer" containerID="87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.906249 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a"} err="failed to get container status \"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\": rpc error: code = NotFound desc = could not find container \"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\": container with ID starting with 87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.906280 4781 scope.go:117] "RemoveContainer" containerID="abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.906530 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a"} err="failed to get container status 
\"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\": rpc error: code = NotFound desc = could not find container \"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\": container with ID starting with abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.906555 4781 scope.go:117] "RemoveContainer" containerID="6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.906765 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9"} err="failed to get container status \"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\": rpc error: code = NotFound desc = could not find container \"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\": container with ID starting with 6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.906794 4781 scope.go:117] "RemoveContainer" containerID="49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.906989 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87"} err="failed to get container status \"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\": rpc error: code = NotFound desc = could not find container \"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\": container with ID starting with 49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.907014 4781 scope.go:117] "RemoveContainer" 
containerID="7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.907200 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403"} err="failed to get container status \"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\": rpc error: code = NotFound desc = could not find container \"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\": container with ID starting with 7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.907223 4781 scope.go:117] "RemoveContainer" containerID="4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.907518 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d"} err="failed to get container status \"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\": rpc error: code = NotFound desc = could not find container \"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\": container with ID starting with 4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.907557 4781 scope.go:117] "RemoveContainer" containerID="34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.907797 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4"} err="failed to get container status \"34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\": rpc error: code = NotFound desc = could 
not find container \"34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\": container with ID starting with 34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.907814 4781 scope.go:117] "RemoveContainer" containerID="ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.908021 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c"} err="failed to get container status \"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c\": rpc error: code = NotFound desc = could not find container \"ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c\": container with ID starting with ffc5b913df4b0e0895fa431a65bbf22c9d40987fcda4381b6ced5e5db9484d3c not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.908059 4781 scope.go:117] "RemoveContainer" containerID="ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.908237 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867"} err="failed to get container status \"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867\": rpc error: code = NotFound desc = could not find container \"ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867\": container with ID starting with ef13dc26354a2a1f5b411390522557be9919e8574bc69587b97c6c08bf1ac867 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.908264 4781 scope.go:117] "RemoveContainer" containerID="f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 
00:17:50.908468 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381"} err="failed to get container status \"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\": rpc error: code = NotFound desc = could not find container \"f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381\": container with ID starting with f41665230a873cda8c658d83ad1d755188504ac7cab15032a9e44e922ca1a381 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.908489 4781 scope.go:117] "RemoveContainer" containerID="87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.908728 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a"} err="failed to get container status \"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\": rpc error: code = NotFound desc = could not find container \"87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a\": container with ID starting with 87aa0d48a721a0d5c22857793a15eb4da436b38f87452343213f75018a2fc07a not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.908747 4781 scope.go:117] "RemoveContainer" containerID="abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.908934 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a"} err="failed to get container status \"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\": rpc error: code = NotFound desc = could not find container \"abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a\": container with ID starting with 
abcc64169cfe244eb88b7f562d4f4c89bb0e9748ae389d1dc8b76aa15d94539a not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.908949 4781 scope.go:117] "RemoveContainer" containerID="6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.909403 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9"} err="failed to get container status \"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\": rpc error: code = NotFound desc = could not find container \"6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9\": container with ID starting with 6a8f4fcc7a6a516bd015cc2381cf8fceac09a74680e3e0f31ce8ce8c0009d2d9 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.909458 4781 scope.go:117] "RemoveContainer" containerID="49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.909740 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87"} err="failed to get container status \"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\": rpc error: code = NotFound desc = could not find container \"49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87\": container with ID starting with 49fb063cc82780373f410cc552e393619bbb7f3bad6dafa6bd40cec08067de87 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.909763 4781 scope.go:117] "RemoveContainer" containerID="7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.910097 4781 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403"} err="failed to get container status \"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\": rpc error: code = NotFound desc = could not find container \"7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403\": container with ID starting with 7578f60a79fcd8573fe4918ab9b149fb45ef85bb6c4db6b8533e760363714403 not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.910121 4781 scope.go:117] "RemoveContainer" containerID="4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.910537 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d"} err="failed to get container status \"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\": rpc error: code = NotFound desc = could not find container \"4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d\": container with ID starting with 4ec4046cf3750b9da352ac7913ab9abc8a997fe62ad96c26bf0b78757aa4af0d not found: ID does not exist" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.910561 4781 scope.go:117] "RemoveContainer" containerID="34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4" Feb 27 00:17:50 crc kubenswrapper[4781]: I0227 00:17:50.910924 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4"} err="failed to get container status \"34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\": rpc error: code = NotFound desc = could not find container \"34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4\": container with ID starting with 34b0423d082a45436025a57b48a113da78a5a4f84361d8d74eee0d0c35b748d4 not found: ID does not 
exist" Feb 27 00:17:51 crc kubenswrapper[4781]: I0227 00:17:51.315548 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12a87c22-b4e1-4aa9-8b3e-a34f7d159239" path="/var/lib/kubelet/pods/12a87c22-b4e1-4aa9-8b3e-a34f7d159239/volumes" Feb 27 00:17:51 crc kubenswrapper[4781]: I0227 00:17:51.640418 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tlstj_9a6dd1e0-45ab-46f0-b298-d89e47aaeecb/kube-multus/2.log" Feb 27 00:17:51 crc kubenswrapper[4781]: I0227 00:17:51.646954 4781 generic.go:334] "Generic (PLEG): container finished" podID="78f87967-e9e0-4e6a-ab3b-2216e4272c02" containerID="4779bf1ca393254a54cc03c243bf87d6a37de3a0aba2f25a6bd06c83d56ea5f0" exitCode=0 Feb 27 00:17:51 crc kubenswrapper[4781]: I0227 00:17:51.646989 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" event={"ID":"78f87967-e9e0-4e6a-ab3b-2216e4272c02","Type":"ContainerDied","Data":"4779bf1ca393254a54cc03c243bf87d6a37de3a0aba2f25a6bd06c83d56ea5f0"} Feb 27 00:17:51 crc kubenswrapper[4781]: I0227 00:17:51.647009 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" event={"ID":"78f87967-e9e0-4e6a-ab3b-2216e4272c02","Type":"ContainerStarted","Data":"31e65d716a1aa553f7a005b056f0c642a16e5d75ee05277a14fb364aca8ff0b2"} Feb 27 00:17:52 crc kubenswrapper[4781]: I0227 00:17:52.655153 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" event={"ID":"78f87967-e9e0-4e6a-ab3b-2216e4272c02","Type":"ContainerStarted","Data":"16fec0c4cec4317659e931ec067a32dbfee38a3efb068b50ec5c22d2ca58f2da"} Feb 27 00:17:52 crc kubenswrapper[4781]: I0227 00:17:52.655743 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" 
event={"ID":"78f87967-e9e0-4e6a-ab3b-2216e4272c02","Type":"ContainerStarted","Data":"84cb6e9827d8d3a53a7e151312e0866cba75754fe5342a5148d2c122359894fd"} Feb 27 00:17:52 crc kubenswrapper[4781]: I0227 00:17:52.655756 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" event={"ID":"78f87967-e9e0-4e6a-ab3b-2216e4272c02","Type":"ContainerStarted","Data":"c95d092c213b49db3f38593dd71b479dd138d69deb902eacf58674dbae8096e2"} Feb 27 00:17:52 crc kubenswrapper[4781]: I0227 00:17:52.655765 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" event={"ID":"78f87967-e9e0-4e6a-ab3b-2216e4272c02","Type":"ContainerStarted","Data":"eda00b6b03c60717614c97557953d6fba2f73eca8083836778766458999f9c0f"} Feb 27 00:17:52 crc kubenswrapper[4781]: I0227 00:17:52.655775 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" event={"ID":"78f87967-e9e0-4e6a-ab3b-2216e4272c02","Type":"ContainerStarted","Data":"b44befc8472b8b3fd3153503b3fa4f58b28a70c1b7695d6ff97ab13081d3feb7"} Feb 27 00:17:52 crc kubenswrapper[4781]: I0227 00:17:52.655783 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" event={"ID":"78f87967-e9e0-4e6a-ab3b-2216e4272c02","Type":"ContainerStarted","Data":"9f2591640a73dda244101bcceab9811e32c1623f492cb472232e7dad03b03a6a"} Feb 27 00:17:52 crc kubenswrapper[4781]: I0227 00:17:52.913620 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr"] Feb 27 00:17:52 crc kubenswrapper[4781]: I0227 00:17:52.914490 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" Feb 27 00:17:52 crc kubenswrapper[4781]: I0227 00:17:52.917583 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-l667z" Feb 27 00:17:52 crc kubenswrapper[4781]: I0227 00:17:52.918233 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Feb 27 00:17:52 crc kubenswrapper[4781]: I0227 00:17:52.919264 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.027867 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7"] Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.028659 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.030248 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-s4267" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.030586 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.036005 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m"] Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.036667 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.061525 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwd7g\" (UniqueName: \"kubernetes.io/projected/c62f5f48-b15f-4d70-837c-a05addc48839-kube-api-access-zwd7g\") pod \"obo-prometheus-operator-68bc856cb9-rbdmr\" (UID: \"c62f5f48-b15f-4d70-837c-a05addc48839\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.153883 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-m6jxs"] Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.154519 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.156778 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-79fv9" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.156778 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.162386 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cbb658fa-808d-4c87-b81e-63863f31382f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m\" (UID: \"cbb658fa-808d-4c87-b81e-63863f31382f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.162450 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwd7g\" (UniqueName: 
\"kubernetes.io/projected/c62f5f48-b15f-4d70-837c-a05addc48839-kube-api-access-zwd7g\") pod \"obo-prometheus-operator-68bc856cb9-rbdmr\" (UID: \"c62f5f48-b15f-4d70-837c-a05addc48839\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.162541 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5abff2aa-f9cb-469e-9a7e-7a6eea64d4db-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7\" (UID: \"5abff2aa-f9cb-469e-9a7e-7a6eea64d4db\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.162807 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cbb658fa-808d-4c87-b81e-63863f31382f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m\" (UID: \"cbb658fa-808d-4c87-b81e-63863f31382f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.162856 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5abff2aa-f9cb-469e-9a7e-7a6eea64d4db-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7\" (UID: \"5abff2aa-f9cb-469e-9a7e-7a6eea64d4db\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.186572 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwd7g\" (UniqueName: \"kubernetes.io/projected/c62f5f48-b15f-4d70-837c-a05addc48839-kube-api-access-zwd7g\") pod \"obo-prometheus-operator-68bc856cb9-rbdmr\" (UID: 
\"c62f5f48-b15f-4d70-837c-a05addc48839\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.231219 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.254197 4781 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-rbdmr_openshift-operators_c62f5f48-b15f-4d70-837c-a05addc48839_0(acf9c3d2dd034d323469dc4151d393eab24946aa7c15575d65ae985cb320fe12): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.254288 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-rbdmr_openshift-operators_c62f5f48-b15f-4d70-837c-a05addc48839_0(acf9c3d2dd034d323469dc4151d393eab24946aa7c15575d65ae985cb320fe12): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.254316 4781 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-rbdmr_openshift-operators_c62f5f48-b15f-4d70-837c-a05addc48839_0(acf9c3d2dd034d323469dc4151d393eab24946aa7c15575d65ae985cb320fe12): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.254372 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-rbdmr_openshift-operators(c62f5f48-b15f-4d70-837c-a05addc48839)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-rbdmr_openshift-operators(c62f5f48-b15f-4d70-837c-a05addc48839)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-rbdmr_openshift-operators_c62f5f48-b15f-4d70-837c-a05addc48839_0(acf9c3d2dd034d323469dc4151d393eab24946aa7c15575d65ae985cb320fe12): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" podUID="c62f5f48-b15f-4d70-837c-a05addc48839" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.263828 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/3fe8e5f0-6c7b-42bd-9604-85a90477d143-observability-operator-tls\") pod \"observability-operator-59bdc8b94-m6jxs\" (UID: \"3fe8e5f0-6c7b-42bd-9604-85a90477d143\") " pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.263894 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9nmx\" (UniqueName: \"kubernetes.io/projected/3fe8e5f0-6c7b-42bd-9604-85a90477d143-kube-api-access-v9nmx\") pod \"observability-operator-59bdc8b94-m6jxs\" (UID: \"3fe8e5f0-6c7b-42bd-9604-85a90477d143\") " pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.263925 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5abff2aa-f9cb-469e-9a7e-7a6eea64d4db-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7\" (UID: \"5abff2aa-f9cb-469e-9a7e-7a6eea64d4db\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.263983 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cbb658fa-808d-4c87-b81e-63863f31382f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m\" (UID: \"cbb658fa-808d-4c87-b81e-63863f31382f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.264004 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5abff2aa-f9cb-469e-9a7e-7a6eea64d4db-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7\" (UID: \"5abff2aa-f9cb-469e-9a7e-7a6eea64d4db\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.264035 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cbb658fa-808d-4c87-b81e-63863f31382f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m\" (UID: \"cbb658fa-808d-4c87-b81e-63863f31382f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.269031 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cbb658fa-808d-4c87-b81e-63863f31382f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m\" (UID: 
\"cbb658fa-808d-4c87-b81e-63863f31382f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.269116 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cbb658fa-808d-4c87-b81e-63863f31382f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m\" (UID: \"cbb658fa-808d-4c87-b81e-63863f31382f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.269381 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5abff2aa-f9cb-469e-9a7e-7a6eea64d4db-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7\" (UID: \"5abff2aa-f9cb-469e-9a7e-7a6eea64d4db\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.271283 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5abff2aa-f9cb-469e-9a7e-7a6eea64d4db-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7\" (UID: \"5abff2aa-f9cb-469e-9a7e-7a6eea64d4db\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.271905 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-l5ppf"] Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.272535 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.275785 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-6zvft" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.346876 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.354417 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.365452 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/3fe8e5f0-6c7b-42bd-9604-85a90477d143-observability-operator-tls\") pod \"observability-operator-59bdc8b94-m6jxs\" (UID: \"3fe8e5f0-6c7b-42bd-9604-85a90477d143\") " pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.365506 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxpsj\" (UniqueName: \"kubernetes.io/projected/1a3a6a15-797e-4cfe-8e21-3a813460012d-kube-api-access-nxpsj\") pod \"perses-operator-5bf474d74f-l5ppf\" (UID: \"1a3a6a15-797e-4cfe-8e21-3a813460012d\") " pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.365538 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9nmx\" (UniqueName: \"kubernetes.io/projected/3fe8e5f0-6c7b-42bd-9604-85a90477d143-kube-api-access-v9nmx\") pod \"observability-operator-59bdc8b94-m6jxs\" (UID: \"3fe8e5f0-6c7b-42bd-9604-85a90477d143\") " 
pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.365577 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/1a3a6a15-797e-4cfe-8e21-3a813460012d-openshift-service-ca\") pod \"perses-operator-5bf474d74f-l5ppf\" (UID: \"1a3a6a15-797e-4cfe-8e21-3a813460012d\") " pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.368312 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/3fe8e5f0-6c7b-42bd-9604-85a90477d143-observability-operator-tls\") pod \"observability-operator-59bdc8b94-m6jxs\" (UID: \"3fe8e5f0-6c7b-42bd-9604-85a90477d143\") " pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.387075 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9nmx\" (UniqueName: \"kubernetes.io/projected/3fe8e5f0-6c7b-42bd-9604-85a90477d143-kube-api-access-v9nmx\") pod \"observability-operator-59bdc8b94-m6jxs\" (UID: \"3fe8e5f0-6c7b-42bd-9604-85a90477d143\") " pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.392588 4781 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_openshift-operators_5abff2aa-f9cb-469e-9a7e-7a6eea64d4db_0(b9ad5d44391eec9fc267745f98887580747de5faa3ec900c41841eae19661de2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.392696 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_openshift-operators_5abff2aa-f9cb-469e-9a7e-7a6eea64d4db_0(b9ad5d44391eec9fc267745f98887580747de5faa3ec900c41841eae19661de2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.392724 4781 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_openshift-operators_5abff2aa-f9cb-469e-9a7e-7a6eea64d4db_0(b9ad5d44391eec9fc267745f98887580747de5faa3ec900c41841eae19661de2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.392784 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_openshift-operators(5abff2aa-f9cb-469e-9a7e-7a6eea64d4db)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_openshift-operators(5abff2aa-f9cb-469e-9a7e-7a6eea64d4db)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_openshift-operators_5abff2aa-f9cb-469e-9a7e-7a6eea64d4db_0(b9ad5d44391eec9fc267745f98887580747de5faa3ec900c41841eae19661de2): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" podUID="5abff2aa-f9cb-469e-9a7e-7a6eea64d4db" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.399453 4781 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_openshift-operators_cbb658fa-808d-4c87-b81e-63863f31382f_0(e9a3f918348b5c9dd835803ad19aab2cb18d72467945e7b9a49e7cca04aed52c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.399500 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_openshift-operators_cbb658fa-808d-4c87-b81e-63863f31382f_0(e9a3f918348b5c9dd835803ad19aab2cb18d72467945e7b9a49e7cca04aed52c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.399523 4781 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_openshift-operators_cbb658fa-808d-4c87-b81e-63863f31382f_0(e9a3f918348b5c9dd835803ad19aab2cb18d72467945e7b9a49e7cca04aed52c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.399565 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_openshift-operators(cbb658fa-808d-4c87-b81e-63863f31382f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_openshift-operators(cbb658fa-808d-4c87-b81e-63863f31382f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_openshift-operators_cbb658fa-808d-4c87-b81e-63863f31382f_0(e9a3f918348b5c9dd835803ad19aab2cb18d72467945e7b9a49e7cca04aed52c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" podUID="cbb658fa-808d-4c87-b81e-63863f31382f" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.466386 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/1a3a6a15-797e-4cfe-8e21-3a813460012d-openshift-service-ca\") pod \"perses-operator-5bf474d74f-l5ppf\" (UID: \"1a3a6a15-797e-4cfe-8e21-3a813460012d\") " pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.466459 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxpsj\" (UniqueName: \"kubernetes.io/projected/1a3a6a15-797e-4cfe-8e21-3a813460012d-kube-api-access-nxpsj\") pod \"perses-operator-5bf474d74f-l5ppf\" (UID: \"1a3a6a15-797e-4cfe-8e21-3a813460012d\") " pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.467403 4781 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/1a3a6a15-797e-4cfe-8e21-3a813460012d-openshift-service-ca\") pod \"perses-operator-5bf474d74f-l5ppf\" (UID: \"1a3a6a15-797e-4cfe-8e21-3a813460012d\") " pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.467684 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.490191 4781 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-m6jxs_openshift-operators_3fe8e5f0-6c7b-42bd-9604-85a90477d143_0(c9439ac03a626440c4a67097ff2d14fb741850684f6c8e5171ba7751d2e17119): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.490245 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-m6jxs_openshift-operators_3fe8e5f0-6c7b-42bd-9604-85a90477d143_0(c9439ac03a626440c4a67097ff2d14fb741850684f6c8e5171ba7751d2e17119): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.490263 4781 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-m6jxs_openshift-operators_3fe8e5f0-6c7b-42bd-9604-85a90477d143_0(c9439ac03a626440c4a67097ff2d14fb741850684f6c8e5171ba7751d2e17119): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.490298 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-m6jxs_openshift-operators(3fe8e5f0-6c7b-42bd-9604-85a90477d143)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-m6jxs_openshift-operators(3fe8e5f0-6c7b-42bd-9604-85a90477d143)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-m6jxs_openshift-operators_3fe8e5f0-6c7b-42bd-9604-85a90477d143_0(c9439ac03a626440c4a67097ff2d14fb741850684f6c8e5171ba7751d2e17119): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" podUID="3fe8e5f0-6c7b-42bd-9604-85a90477d143" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.490370 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxpsj\" (UniqueName: \"kubernetes.io/projected/1a3a6a15-797e-4cfe-8e21-3a813460012d-kube-api-access-nxpsj\") pod \"perses-operator-5bf474d74f-l5ppf\" (UID: \"1a3a6a15-797e-4cfe-8e21-3a813460012d\") " pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:17:53 crc kubenswrapper[4781]: I0227 00:17:53.609016 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.629078 4781 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-l5ppf_openshift-operators_1a3a6a15-797e-4cfe-8e21-3a813460012d_0(a1316e086242627f09e5957be3f514dad12cd3073dc2d380126c3fa8511f7666): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.629147 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-l5ppf_openshift-operators_1a3a6a15-797e-4cfe-8e21-3a813460012d_0(a1316e086242627f09e5957be3f514dad12cd3073dc2d380126c3fa8511f7666): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.629170 4781 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-l5ppf_openshift-operators_1a3a6a15-797e-4cfe-8e21-3a813460012d_0(a1316e086242627f09e5957be3f514dad12cd3073dc2d380126c3fa8511f7666): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:17:53 crc kubenswrapper[4781]: E0227 00:17:53.629216 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-l5ppf_openshift-operators(1a3a6a15-797e-4cfe-8e21-3a813460012d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-l5ppf_openshift-operators(1a3a6a15-797e-4cfe-8e21-3a813460012d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-l5ppf_openshift-operators_1a3a6a15-797e-4cfe-8e21-3a813460012d_0(a1316e086242627f09e5957be3f514dad12cd3073dc2d380126c3fa8511f7666): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" podUID="1a3a6a15-797e-4cfe-8e21-3a813460012d" Feb 27 00:17:54 crc kubenswrapper[4781]: I0227 00:17:54.668798 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" event={"ID":"78f87967-e9e0-4e6a-ab3b-2216e4272c02","Type":"ContainerStarted","Data":"75adc4ae198a13b0195a243660d4297b8663ce118937ac9ddf788b89c83b01b8"} Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.695986 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" event={"ID":"78f87967-e9e0-4e6a-ab3b-2216e4272c02","Type":"ContainerStarted","Data":"51116f40c35ad454a2ffd20e3c51ff29acacefc1e89326249603a688f8c6e13a"} Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.696401 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.696414 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.727734 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" podStartSLOduration=7.727717015 podStartE2EDuration="7.727717015s" podCreationTimestamp="2026-02-27 00:17:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:17:57.727042699 +0000 UTC m=+746.984582253" watchObservedRunningTime="2026-02-27 00:17:57.727717015 +0000 UTC m=+746.985256569" Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.735945 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.823765 4781 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr"] Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.823872 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.824182 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.840258 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-m6jxs"] Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.840367 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.840818 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.848811 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7"] Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.862573 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.863310 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.881463 4781 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-rbdmr_openshift-operators_c62f5f48-b15f-4d70-837c-a05addc48839_0(3e2d4f447c77304e54d5e9ab985166aa513fdb1b35003b643f92150cfbc07400): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.881581 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-rbdmr_openshift-operators_c62f5f48-b15f-4d70-837c-a05addc48839_0(3e2d4f447c77304e54d5e9ab985166aa513fdb1b35003b643f92150cfbc07400): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.881611 4781 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-rbdmr_openshift-operators_c62f5f48-b15f-4d70-837c-a05addc48839_0(3e2d4f447c77304e54d5e9ab985166aa513fdb1b35003b643f92150cfbc07400): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.881716 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-rbdmr_openshift-operators(c62f5f48-b15f-4d70-837c-a05addc48839)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-rbdmr_openshift-operators(c62f5f48-b15f-4d70-837c-a05addc48839)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-rbdmr_openshift-operators_c62f5f48-b15f-4d70-837c-a05addc48839_0(3e2d4f447c77304e54d5e9ab985166aa513fdb1b35003b643f92150cfbc07400): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" podUID="c62f5f48-b15f-4d70-837c-a05addc48839" Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.893717 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m"] Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.893819 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.894073 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.899886 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-l5ppf"] Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.900013 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:17:57 crc kubenswrapper[4781]: I0227 00:17:57.900565 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.913948 4781 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_openshift-operators_5abff2aa-f9cb-469e-9a7e-7a6eea64d4db_0(3d17b4f254747363491e0f9b5b7184aed40a0a05daa08d692096524b65a07108): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.914084 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_openshift-operators_5abff2aa-f9cb-469e-9a7e-7a6eea64d4db_0(3d17b4f254747363491e0f9b5b7184aed40a0a05daa08d692096524b65a07108): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.914112 4781 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_openshift-operators_5abff2aa-f9cb-469e-9a7e-7a6eea64d4db_0(3d17b4f254747363491e0f9b5b7184aed40a0a05daa08d692096524b65a07108): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.914167 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_openshift-operators(5abff2aa-f9cb-469e-9a7e-7a6eea64d4db)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_openshift-operators(5abff2aa-f9cb-469e-9a7e-7a6eea64d4db)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_openshift-operators_5abff2aa-f9cb-469e-9a7e-7a6eea64d4db_0(3d17b4f254747363491e0f9b5b7184aed40a0a05daa08d692096524b65a07108): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" podUID="5abff2aa-f9cb-469e-9a7e-7a6eea64d4db" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.922931 4781 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-m6jxs_openshift-operators_3fe8e5f0-6c7b-42bd-9604-85a90477d143_0(b8535a2fdff6659d45bcf9b47e61354b9449438e65a8c33e4182da8d8c3d277d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.922988 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-m6jxs_openshift-operators_3fe8e5f0-6c7b-42bd-9604-85a90477d143_0(b8535a2fdff6659d45bcf9b47e61354b9449438e65a8c33e4182da8d8c3d277d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.923012 4781 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-m6jxs_openshift-operators_3fe8e5f0-6c7b-42bd-9604-85a90477d143_0(b8535a2fdff6659d45bcf9b47e61354b9449438e65a8c33e4182da8d8c3d277d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.923052 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-m6jxs_openshift-operators(3fe8e5f0-6c7b-42bd-9604-85a90477d143)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-m6jxs_openshift-operators(3fe8e5f0-6c7b-42bd-9604-85a90477d143)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-m6jxs_openshift-operators_3fe8e5f0-6c7b-42bd-9604-85a90477d143_0(b8535a2fdff6659d45bcf9b47e61354b9449438e65a8c33e4182da8d8c3d277d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" podUID="3fe8e5f0-6c7b-42bd-9604-85a90477d143" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.940066 4781 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_openshift-operators_cbb658fa-808d-4c87-b81e-63863f31382f_0(9f3f595c1473a4901329f59296c9a978e61e3ea75aed3a4f5f2d0034bd3424be): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.940132 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_openshift-operators_cbb658fa-808d-4c87-b81e-63863f31382f_0(9f3f595c1473a4901329f59296c9a978e61e3ea75aed3a4f5f2d0034bd3424be): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.940153 4781 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_openshift-operators_cbb658fa-808d-4c87-b81e-63863f31382f_0(9f3f595c1473a4901329f59296c9a978e61e3ea75aed3a4f5f2d0034bd3424be): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.940198 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_openshift-operators(cbb658fa-808d-4c87-b81e-63863f31382f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_openshift-operators(cbb658fa-808d-4c87-b81e-63863f31382f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_openshift-operators_cbb658fa-808d-4c87-b81e-63863f31382f_0(9f3f595c1473a4901329f59296c9a978e61e3ea75aed3a4f5f2d0034bd3424be): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" podUID="cbb658fa-808d-4c87-b81e-63863f31382f" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.958910 4781 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-l5ppf_openshift-operators_1a3a6a15-797e-4cfe-8e21-3a813460012d_0(016693864dfc49d232fe7737deef5dea5375af5b1e910ee34530ac943cc9c15b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.958987 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-l5ppf_openshift-operators_1a3a6a15-797e-4cfe-8e21-3a813460012d_0(016693864dfc49d232fe7737deef5dea5375af5b1e910ee34530ac943cc9c15b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.959012 4781 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-l5ppf_openshift-operators_1a3a6a15-797e-4cfe-8e21-3a813460012d_0(016693864dfc49d232fe7737deef5dea5375af5b1e910ee34530ac943cc9c15b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:17:57 crc kubenswrapper[4781]: E0227 00:17:57.959067 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-l5ppf_openshift-operators(1a3a6a15-797e-4cfe-8e21-3a813460012d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-l5ppf_openshift-operators(1a3a6a15-797e-4cfe-8e21-3a813460012d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-l5ppf_openshift-operators_1a3a6a15-797e-4cfe-8e21-3a813460012d_0(016693864dfc49d232fe7737deef5dea5375af5b1e910ee34530ac943cc9c15b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" podUID="1a3a6a15-797e-4cfe-8e21-3a813460012d" Feb 27 00:17:58 crc kubenswrapper[4781]: I0227 00:17:58.700411 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:17:58 crc kubenswrapper[4781]: I0227 00:17:58.759000 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:18:00 crc kubenswrapper[4781]: I0227 00:18:00.123785 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535858-9fs8d"] Feb 27 00:18:00 crc kubenswrapper[4781]: I0227 00:18:00.125156 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535858-9fs8d" Feb 27 00:18:00 crc kubenswrapper[4781]: I0227 00:18:00.126898 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:18:00 crc kubenswrapper[4781]: I0227 00:18:00.126941 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:18:00 crc kubenswrapper[4781]: I0227 00:18:00.128160 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:18:00 crc kubenswrapper[4781]: I0227 00:18:00.133410 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535858-9fs8d"] Feb 27 00:18:00 crc kubenswrapper[4781]: I0227 00:18:00.262526 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl65t\" (UniqueName: \"kubernetes.io/projected/3bb1e1bd-28ea-42f4-96d5-534db2674e68-kube-api-access-zl65t\") pod \"auto-csr-approver-29535858-9fs8d\" (UID: \"3bb1e1bd-28ea-42f4-96d5-534db2674e68\") " pod="openshift-infra/auto-csr-approver-29535858-9fs8d" Feb 27 00:18:00 crc kubenswrapper[4781]: I0227 00:18:00.363745 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl65t\" (UniqueName: \"kubernetes.io/projected/3bb1e1bd-28ea-42f4-96d5-534db2674e68-kube-api-access-zl65t\") pod \"auto-csr-approver-29535858-9fs8d\" (UID: \"3bb1e1bd-28ea-42f4-96d5-534db2674e68\") " pod="openshift-infra/auto-csr-approver-29535858-9fs8d" Feb 27 00:18:00 crc kubenswrapper[4781]: I0227 00:18:00.386105 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl65t\" (UniqueName: \"kubernetes.io/projected/3bb1e1bd-28ea-42f4-96d5-534db2674e68-kube-api-access-zl65t\") pod \"auto-csr-approver-29535858-9fs8d\" (UID: \"3bb1e1bd-28ea-42f4-96d5-534db2674e68\") " 
pod="openshift-infra/auto-csr-approver-29535858-9fs8d" Feb 27 00:18:00 crc kubenswrapper[4781]: I0227 00:18:00.456893 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535858-9fs8d" Feb 27 00:18:00 crc kubenswrapper[4781]: E0227 00:18:00.483809 4781 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535858-9fs8d_openshift-infra_3bb1e1bd-28ea-42f4-96d5-534db2674e68_0(a6422c13a22f0fd16b8f4c41ee16e0830dd21d1c8ffb88c80497e4e894942082): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:18:00 crc kubenswrapper[4781]: E0227 00:18:00.483896 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535858-9fs8d_openshift-infra_3bb1e1bd-28ea-42f4-96d5-534db2674e68_0(a6422c13a22f0fd16b8f4c41ee16e0830dd21d1c8ffb88c80497e4e894942082): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29535858-9fs8d" Feb 27 00:18:00 crc kubenswrapper[4781]: E0227 00:18:00.483933 4781 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535858-9fs8d_openshift-infra_3bb1e1bd-28ea-42f4-96d5-534db2674e68_0(a6422c13a22f0fd16b8f4c41ee16e0830dd21d1c8ffb88c80497e4e894942082): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-infra/auto-csr-approver-29535858-9fs8d" Feb 27 00:18:00 crc kubenswrapper[4781]: E0227 00:18:00.483976 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29535858-9fs8d_openshift-infra(3bb1e1bd-28ea-42f4-96d5-534db2674e68)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29535858-9fs8d_openshift-infra(3bb1e1bd-28ea-42f4-96d5-534db2674e68)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535858-9fs8d_openshift-infra_3bb1e1bd-28ea-42f4-96d5-534db2674e68_0(a6422c13a22f0fd16b8f4c41ee16e0830dd21d1c8ffb88c80497e4e894942082): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29535858-9fs8d" podUID="3bb1e1bd-28ea-42f4-96d5-534db2674e68" Feb 27 00:18:00 crc kubenswrapper[4781]: I0227 00:18:00.709099 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535858-9fs8d" Feb 27 00:18:00 crc kubenswrapper[4781]: I0227 00:18:00.709508 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535858-9fs8d" Feb 27 00:18:00 crc kubenswrapper[4781]: E0227 00:18:00.742427 4781 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535858-9fs8d_openshift-infra_3bb1e1bd-28ea-42f4-96d5-534db2674e68_0(d9421be3c1dae616d2afee4d7ac46bfab378f5bcb3f8439d332a0506ef9e5834): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 27 00:18:00 crc kubenswrapper[4781]: E0227 00:18:00.742484 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535858-9fs8d_openshift-infra_3bb1e1bd-28ea-42f4-96d5-534db2674e68_0(d9421be3c1dae616d2afee4d7ac46bfab378f5bcb3f8439d332a0506ef9e5834): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29535858-9fs8d" Feb 27 00:18:00 crc kubenswrapper[4781]: E0227 00:18:00.742504 4781 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535858-9fs8d_openshift-infra_3bb1e1bd-28ea-42f4-96d5-534db2674e68_0(d9421be3c1dae616d2afee4d7ac46bfab378f5bcb3f8439d332a0506ef9e5834): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29535858-9fs8d" Feb 27 00:18:00 crc kubenswrapper[4781]: E0227 00:18:00.742539 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29535858-9fs8d_openshift-infra(3bb1e1bd-28ea-42f4-96d5-534db2674e68)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29535858-9fs8d_openshift-infra(3bb1e1bd-28ea-42f4-96d5-534db2674e68)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535858-9fs8d_openshift-infra_3bb1e1bd-28ea-42f4-96d5-534db2674e68_0(d9421be3c1dae616d2afee4d7ac46bfab378f5bcb3f8439d332a0506ef9e5834): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29535858-9fs8d" podUID="3bb1e1bd-28ea-42f4-96d5-534db2674e68" Feb 27 00:18:03 crc kubenswrapper[4781]: I0227 00:18:03.308970 4781 scope.go:117] "RemoveContainer" containerID="a286864c68415e96f38bba630ac2325989837881e34a926c93977715f330a129" Feb 27 00:18:03 crc kubenswrapper[4781]: E0227 00:18:03.309427 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-tlstj_openshift-multus(9a6dd1e0-45ab-46f0-b298-d89e47aaeecb)\"" pod="openshift-multus/multus-tlstj" podUID="9a6dd1e0-45ab-46f0-b298-d89e47aaeecb" Feb 27 00:18:08 crc kubenswrapper[4781]: I0227 00:18:08.308434 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:18:08 crc kubenswrapper[4781]: I0227 00:18:08.309183 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:18:08 crc kubenswrapper[4781]: E0227 00:18:08.338728 4781 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_openshift-operators_cbb658fa-808d-4c87-b81e-63863f31382f_0(f5e9e5e75916086dbc8cc15d24649fe4e7be5140046b48e992dd72871eb24e01): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 27 00:18:08 crc kubenswrapper[4781]: E0227 00:18:08.339375 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_openshift-operators_cbb658fa-808d-4c87-b81e-63863f31382f_0(f5e9e5e75916086dbc8cc15d24649fe4e7be5140046b48e992dd72871eb24e01): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:18:08 crc kubenswrapper[4781]: E0227 00:18:08.339414 4781 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_openshift-operators_cbb658fa-808d-4c87-b81e-63863f31382f_0(f5e9e5e75916086dbc8cc15d24649fe4e7be5140046b48e992dd72871eb24e01): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:18:08 crc kubenswrapper[4781]: E0227 00:18:08.339463 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_openshift-operators(cbb658fa-808d-4c87-b81e-63863f31382f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_openshift-operators(cbb658fa-808d-4c87-b81e-63863f31382f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_openshift-operators_cbb658fa-808d-4c87-b81e-63863f31382f_0(f5e9e5e75916086dbc8cc15d24649fe4e7be5140046b48e992dd72871eb24e01): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" podUID="cbb658fa-808d-4c87-b81e-63863f31382f" Feb 27 00:18:09 crc kubenswrapper[4781]: I0227 00:18:09.308850 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" Feb 27 00:18:09 crc kubenswrapper[4781]: I0227 00:18:09.308893 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" Feb 27 00:18:09 crc kubenswrapper[4781]: I0227 00:18:09.309005 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:18:09 crc kubenswrapper[4781]: I0227 00:18:09.309436 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" Feb 27 00:18:09 crc kubenswrapper[4781]: I0227 00:18:09.309707 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:18:09 crc kubenswrapper[4781]: I0227 00:18:09.309783 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" Feb 27 00:18:09 crc kubenswrapper[4781]: E0227 00:18:09.353782 4781 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-rbdmr_openshift-operators_c62f5f48-b15f-4d70-837c-a05addc48839_0(6559c0ff8f272afb500e316558ef1183a20ceb233ac20054686125ad4553fed9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 27 00:18:09 crc kubenswrapper[4781]: E0227 00:18:09.353899 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-rbdmr_openshift-operators_c62f5f48-b15f-4d70-837c-a05addc48839_0(6559c0ff8f272afb500e316558ef1183a20ceb233ac20054686125ad4553fed9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" Feb 27 00:18:09 crc kubenswrapper[4781]: E0227 00:18:09.353923 4781 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-rbdmr_openshift-operators_c62f5f48-b15f-4d70-837c-a05addc48839_0(6559c0ff8f272afb500e316558ef1183a20ceb233ac20054686125ad4553fed9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" Feb 27 00:18:09 crc kubenswrapper[4781]: E0227 00:18:09.353985 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-rbdmr_openshift-operators(c62f5f48-b15f-4d70-837c-a05addc48839)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-rbdmr_openshift-operators(c62f5f48-b15f-4d70-837c-a05addc48839)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-rbdmr_openshift-operators_c62f5f48-b15f-4d70-837c-a05addc48839_0(6559c0ff8f272afb500e316558ef1183a20ceb233ac20054686125ad4553fed9): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" podUID="c62f5f48-b15f-4d70-837c-a05addc48839" Feb 27 00:18:09 crc kubenswrapper[4781]: E0227 00:18:09.358212 4781 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-l5ppf_openshift-operators_1a3a6a15-797e-4cfe-8e21-3a813460012d_0(0d64a666439c90f739396fc0b07da83d309c4ba88ae567bac4aaa83b4bffcd61): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:18:09 crc kubenswrapper[4781]: E0227 00:18:09.358271 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-l5ppf_openshift-operators_1a3a6a15-797e-4cfe-8e21-3a813460012d_0(0d64a666439c90f739396fc0b07da83d309c4ba88ae567bac4aaa83b4bffcd61): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:18:09 crc kubenswrapper[4781]: E0227 00:18:09.358291 4781 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-l5ppf_openshift-operators_1a3a6a15-797e-4cfe-8e21-3a813460012d_0(0d64a666439c90f739396fc0b07da83d309c4ba88ae567bac4aaa83b4bffcd61): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:18:09 crc kubenswrapper[4781]: E0227 00:18:09.358339 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-l5ppf_openshift-operators(1a3a6a15-797e-4cfe-8e21-3a813460012d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-l5ppf_openshift-operators(1a3a6a15-797e-4cfe-8e21-3a813460012d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-l5ppf_openshift-operators_1a3a6a15-797e-4cfe-8e21-3a813460012d_0(0d64a666439c90f739396fc0b07da83d309c4ba88ae567bac4aaa83b4bffcd61): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" podUID="1a3a6a15-797e-4cfe-8e21-3a813460012d" Feb 27 00:18:09 crc kubenswrapper[4781]: E0227 00:18:09.362888 4781 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_openshift-operators_5abff2aa-f9cb-469e-9a7e-7a6eea64d4db_0(feef5134aede6983b37f1d89f1bd220bcfd8c9a6654973fda44ec2b9bcf0cc4f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:18:09 crc kubenswrapper[4781]: E0227 00:18:09.363023 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_openshift-operators_5abff2aa-f9cb-469e-9a7e-7a6eea64d4db_0(feef5134aede6983b37f1d89f1bd220bcfd8c9a6654973fda44ec2b9bcf0cc4f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" Feb 27 00:18:09 crc kubenswrapper[4781]: E0227 00:18:09.363122 4781 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_openshift-operators_5abff2aa-f9cb-469e-9a7e-7a6eea64d4db_0(feef5134aede6983b37f1d89f1bd220bcfd8c9a6654973fda44ec2b9bcf0cc4f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" Feb 27 00:18:09 crc kubenswrapper[4781]: E0227 00:18:09.363252 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_openshift-operators(5abff2aa-f9cb-469e-9a7e-7a6eea64d4db)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_openshift-operators(5abff2aa-f9cb-469e-9a7e-7a6eea64d4db)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_openshift-operators_5abff2aa-f9cb-469e-9a7e-7a6eea64d4db_0(feef5134aede6983b37f1d89f1bd220bcfd8c9a6654973fda44ec2b9bcf0cc4f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" podUID="5abff2aa-f9cb-469e-9a7e-7a6eea64d4db" Feb 27 00:18:10 crc kubenswrapper[4781]: I0227 00:18:10.309340 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" Feb 27 00:18:10 crc kubenswrapper[4781]: I0227 00:18:10.309837 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" Feb 27 00:18:10 crc kubenswrapper[4781]: E0227 00:18:10.334093 4781 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-m6jxs_openshift-operators_3fe8e5f0-6c7b-42bd-9604-85a90477d143_0(b3a07f20b55adde8b5c8d332535c3de5d766b2de7cdd6e78706dadbf116fc5dc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 00:18:10 crc kubenswrapper[4781]: E0227 00:18:10.334239 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-m6jxs_openshift-operators_3fe8e5f0-6c7b-42bd-9604-85a90477d143_0(b3a07f20b55adde8b5c8d332535c3de5d766b2de7cdd6e78706dadbf116fc5dc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" Feb 27 00:18:10 crc kubenswrapper[4781]: E0227 00:18:10.334316 4781 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-m6jxs_openshift-operators_3fe8e5f0-6c7b-42bd-9604-85a90477d143_0(b3a07f20b55adde8b5c8d332535c3de5d766b2de7cdd6e78706dadbf116fc5dc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" Feb 27 00:18:10 crc kubenswrapper[4781]: E0227 00:18:10.334412 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-m6jxs_openshift-operators(3fe8e5f0-6c7b-42bd-9604-85a90477d143)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-m6jxs_openshift-operators(3fe8e5f0-6c7b-42bd-9604-85a90477d143)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-m6jxs_openshift-operators_3fe8e5f0-6c7b-42bd-9604-85a90477d143_0(b3a07f20b55adde8b5c8d332535c3de5d766b2de7cdd6e78706dadbf116fc5dc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" podUID="3fe8e5f0-6c7b-42bd-9604-85a90477d143" Feb 27 00:18:14 crc kubenswrapper[4781]: I0227 00:18:14.308823 4781 scope.go:117] "RemoveContainer" containerID="a286864c68415e96f38bba630ac2325989837881e34a926c93977715f330a129" Feb 27 00:18:14 crc kubenswrapper[4781]: I0227 00:18:14.788967 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tlstj_9a6dd1e0-45ab-46f0-b298-d89e47aaeecb/kube-multus/2.log" Feb 27 00:18:14 crc kubenswrapper[4781]: I0227 00:18:14.789309 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tlstj" event={"ID":"9a6dd1e0-45ab-46f0-b298-d89e47aaeecb","Type":"ContainerStarted","Data":"e8389900490984b4abb551a07e748ac9856ad8ae2f3078b664efd67dcaf5090c"} Feb 27 00:18:16 crc kubenswrapper[4781]: I0227 00:18:16.309081 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535858-9fs8d" Feb 27 00:18:16 crc kubenswrapper[4781]: I0227 00:18:16.309924 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535858-9fs8d" Feb 27 00:18:16 crc kubenswrapper[4781]: I0227 00:18:16.543313 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535858-9fs8d"] Feb 27 00:18:16 crc kubenswrapper[4781]: W0227 00:18:16.548222 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bb1e1bd_28ea_42f4_96d5_534db2674e68.slice/crio-5753f73223e139dd23552ed49014d8187711d2c8fe095b49b3d277fc5cfa3bb6 WatchSource:0}: Error finding container 5753f73223e139dd23552ed49014d8187711d2c8fe095b49b3d277fc5cfa3bb6: Status 404 returned error can't find the container with id 5753f73223e139dd23552ed49014d8187711d2c8fe095b49b3d277fc5cfa3bb6 Feb 27 00:18:16 crc kubenswrapper[4781]: I0227 00:18:16.799073 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535858-9fs8d" event={"ID":"3bb1e1bd-28ea-42f4-96d5-534db2674e68","Type":"ContainerStarted","Data":"5753f73223e139dd23552ed49014d8187711d2c8fe095b49b3d277fc5cfa3bb6"} Feb 27 00:18:17 crc kubenswrapper[4781]: I0227 00:18:17.806295 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535858-9fs8d" event={"ID":"3bb1e1bd-28ea-42f4-96d5-534db2674e68","Type":"ContainerStarted","Data":"86bad95d795a7faf37cb19be6e8217786d2cabd57a047f7210f59250bf6bee2f"} Feb 27 00:18:17 crc kubenswrapper[4781]: I0227 00:18:17.820100 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535858-9fs8d" podStartSLOduration=17.047292824 podStartE2EDuration="17.820079712s" podCreationTimestamp="2026-02-27 00:18:00 +0000 UTC" firstStartedPulling="2026-02-27 00:18:16.551528091 +0000 UTC m=+765.809067645" lastFinishedPulling="2026-02-27 00:18:17.324314939 +0000 UTC m=+766.581854533" observedRunningTime="2026-02-27 00:18:17.816605646 +0000 UTC m=+767.074145210" 
watchObservedRunningTime="2026-02-27 00:18:17.820079712 +0000 UTC m=+767.077619276" Feb 27 00:18:18 crc kubenswrapper[4781]: I0227 00:18:18.815357 4781 generic.go:334] "Generic (PLEG): container finished" podID="3bb1e1bd-28ea-42f4-96d5-534db2674e68" containerID="86bad95d795a7faf37cb19be6e8217786d2cabd57a047f7210f59250bf6bee2f" exitCode=0 Feb 27 00:18:18 crc kubenswrapper[4781]: I0227 00:18:18.815416 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535858-9fs8d" event={"ID":"3bb1e1bd-28ea-42f4-96d5-534db2674e68","Type":"ContainerDied","Data":"86bad95d795a7faf37cb19be6e8217786d2cabd57a047f7210f59250bf6bee2f"} Feb 27 00:18:20 crc kubenswrapper[4781]: I0227 00:18:20.309267 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:18:20 crc kubenswrapper[4781]: I0227 00:18:20.309881 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" Feb 27 00:18:20 crc kubenswrapper[4781]: I0227 00:18:20.582448 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jfhx4" Feb 27 00:18:20 crc kubenswrapper[4781]: I0227 00:18:20.707460 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m"] Feb 27 00:18:20 crc kubenswrapper[4781]: W0227 00:18:20.718835 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbb658fa_808d_4c87_b81e_63863f31382f.slice/crio-7c438ce7c5544007d38c6cd9e1c6987673991fe6e99ffa2fb11d9d759b0abc72 WatchSource:0}: Error finding container 7c438ce7c5544007d38c6cd9e1c6987673991fe6e99ffa2fb11d9d759b0abc72: Status 404 returned error can't find the container with id 
7c438ce7c5544007d38c6cd9e1c6987673991fe6e99ffa2fb11d9d759b0abc72 Feb 27 00:18:20 crc kubenswrapper[4781]: I0227 00:18:20.827476 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" event={"ID":"cbb658fa-808d-4c87-b81e-63863f31382f","Type":"ContainerStarted","Data":"7c438ce7c5544007d38c6cd9e1c6987673991fe6e99ffa2fb11d9d759b0abc72"} Feb 27 00:18:20 crc kubenswrapper[4781]: I0227 00:18:20.967259 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535858-9fs8d" Feb 27 00:18:21 crc kubenswrapper[4781]: I0227 00:18:21.031413 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zl65t\" (UniqueName: \"kubernetes.io/projected/3bb1e1bd-28ea-42f4-96d5-534db2674e68-kube-api-access-zl65t\") pod \"3bb1e1bd-28ea-42f4-96d5-534db2674e68\" (UID: \"3bb1e1bd-28ea-42f4-96d5-534db2674e68\") " Feb 27 00:18:21 crc kubenswrapper[4781]: I0227 00:18:21.038888 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bb1e1bd-28ea-42f4-96d5-534db2674e68-kube-api-access-zl65t" (OuterVolumeSpecName: "kube-api-access-zl65t") pod "3bb1e1bd-28ea-42f4-96d5-534db2674e68" (UID: "3bb1e1bd-28ea-42f4-96d5-534db2674e68"). InnerVolumeSpecName "kube-api-access-zl65t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:18:21 crc kubenswrapper[4781]: I0227 00:18:21.132429 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zl65t\" (UniqueName: \"kubernetes.io/projected/3bb1e1bd-28ea-42f4-96d5-534db2674e68-kube-api-access-zl65t\") on node \"crc\" DevicePath \"\"" Feb 27 00:18:21 crc kubenswrapper[4781]: I0227 00:18:21.309248 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" Feb 27 00:18:21 crc kubenswrapper[4781]: I0227 00:18:21.321898 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" Feb 27 00:18:21 crc kubenswrapper[4781]: I0227 00:18:21.737786 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7"] Feb 27 00:18:21 crc kubenswrapper[4781]: W0227 00:18:21.744707 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5abff2aa_f9cb_469e_9a7e_7a6eea64d4db.slice/crio-793ec7f20ef3785eba07b672c006673b2739fb315e82e8c929f71449c39bd1b4 WatchSource:0}: Error finding container 793ec7f20ef3785eba07b672c006673b2739fb315e82e8c929f71449c39bd1b4: Status 404 returned error can't find the container with id 793ec7f20ef3785eba07b672c006673b2739fb315e82e8c929f71449c39bd1b4 Feb 27 00:18:21 crc kubenswrapper[4781]: I0227 00:18:21.833517 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535858-9fs8d" event={"ID":"3bb1e1bd-28ea-42f4-96d5-534db2674e68","Type":"ContainerDied","Data":"5753f73223e139dd23552ed49014d8187711d2c8fe095b49b3d277fc5cfa3bb6"} Feb 27 00:18:21 crc kubenswrapper[4781]: I0227 00:18:21.833579 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5753f73223e139dd23552ed49014d8187711d2c8fe095b49b3d277fc5cfa3bb6" Feb 27 00:18:21 crc kubenswrapper[4781]: I0227 00:18:21.833985 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535858-9fs8d" Feb 27 00:18:21 crc kubenswrapper[4781]: I0227 00:18:21.835226 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" event={"ID":"5abff2aa-f9cb-469e-9a7e-7a6eea64d4db","Type":"ContainerStarted","Data":"793ec7f20ef3785eba07b672c006673b2739fb315e82e8c929f71449c39bd1b4"} Feb 27 00:18:22 crc kubenswrapper[4781]: I0227 00:18:22.036746 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535852-49dfn"] Feb 27 00:18:22 crc kubenswrapper[4781]: I0227 00:18:22.040530 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535852-49dfn"] Feb 27 00:18:23 crc kubenswrapper[4781]: I0227 00:18:23.310853 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" Feb 27 00:18:23 crc kubenswrapper[4781]: I0227 00:18:23.311538 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" Feb 27 00:18:23 crc kubenswrapper[4781]: I0227 00:18:23.316500 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96ecbd6e-c579-40ca-a5bf-9876777721f9" path="/var/lib/kubelet/pods/96ecbd6e-c579-40ca-a5bf-9876777721f9/volumes" Feb 27 00:18:23 crc kubenswrapper[4781]: I0227 00:18:23.698667 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr"] Feb 27 00:18:24 crc kubenswrapper[4781]: I0227 00:18:24.309224 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:18:24 crc kubenswrapper[4781]: I0227 00:18:24.309749 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:18:25 crc kubenswrapper[4781]: W0227 00:18:25.092341 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc62f5f48_b15f_4d70_837c_a05addc48839.slice/crio-437727a4cb9884dae84df7c6049639908951feb4238c0dfe9b2c0ae180a02a8f WatchSource:0}: Error finding container 437727a4cb9884dae84df7c6049639908951feb4238c0dfe9b2c0ae180a02a8f: Status 404 returned error can't find the container with id 437727a4cb9884dae84df7c6049639908951feb4238c0dfe9b2c0ae180a02a8f Feb 27 00:18:25 crc kubenswrapper[4781]: I0227 00:18:25.286897 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-l5ppf"] Feb 27 00:18:25 crc kubenswrapper[4781]: W0227 00:18:25.291114 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a3a6a15_797e_4cfe_8e21_3a813460012d.slice/crio-be4d989c5d40629c9ffc9d258e8cc02cceb9e3af824894dd87ad3b8e8676f558 WatchSource:0}: Error finding container be4d989c5d40629c9ffc9d258e8cc02cceb9e3af824894dd87ad3b8e8676f558: Status 404 returned error can't find the container with id be4d989c5d40629c9ffc9d258e8cc02cceb9e3af824894dd87ad3b8e8676f558 Feb 27 00:18:25 crc kubenswrapper[4781]: I0227 00:18:25.309367 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" Feb 27 00:18:25 crc kubenswrapper[4781]: I0227 00:18:25.309847 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" Feb 27 00:18:25 crc kubenswrapper[4781]: I0227 00:18:25.763018 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-m6jxs"] Feb 27 00:18:25 crc kubenswrapper[4781]: W0227 00:18:25.768914 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fe8e5f0_6c7b_42bd_9604_85a90477d143.slice/crio-955c9789b0bffc9aab82cb525b68f5f6f853e560a21e4930ac4fc52cb75812da WatchSource:0}: Error finding container 955c9789b0bffc9aab82cb525b68f5f6f853e560a21e4930ac4fc52cb75812da: Status 404 returned error can't find the container with id 955c9789b0bffc9aab82cb525b68f5f6f853e560a21e4930ac4fc52cb75812da Feb 27 00:18:25 crc kubenswrapper[4781]: I0227 00:18:25.878697 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" event={"ID":"cbb658fa-808d-4c87-b81e-63863f31382f","Type":"ContainerStarted","Data":"20271a79cce7f288f3687287e215ae8c9185ee9c5d926a46a261cc4110ea2da5"} Feb 27 00:18:25 crc kubenswrapper[4781]: I0227 00:18:25.880717 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" event={"ID":"3fe8e5f0-6c7b-42bd-9604-85a90477d143","Type":"ContainerStarted","Data":"955c9789b0bffc9aab82cb525b68f5f6f853e560a21e4930ac4fc52cb75812da"} Feb 27 00:18:25 crc kubenswrapper[4781]: I0227 00:18:25.882168 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" event={"ID":"5abff2aa-f9cb-469e-9a7e-7a6eea64d4db","Type":"ContainerStarted","Data":"a2fceaf7e740c7abd736297d999fdb938fda61424b439e9b244cba064b1fa240"} Feb 27 00:18:25 crc kubenswrapper[4781]: I0227 00:18:25.883548 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" event={"ID":"c62f5f48-b15f-4d70-837c-a05addc48839","Type":"ContainerStarted","Data":"437727a4cb9884dae84df7c6049639908951feb4238c0dfe9b2c0ae180a02a8f"} Feb 27 00:18:25 crc kubenswrapper[4781]: I0227 00:18:25.884842 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" event={"ID":"1a3a6a15-797e-4cfe-8e21-3a813460012d","Type":"ContainerStarted","Data":"be4d989c5d40629c9ffc9d258e8cc02cceb9e3af824894dd87ad3b8e8676f558"} Feb 27 00:18:25 crc kubenswrapper[4781]: I0227 00:18:25.915284 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m" podStartSLOduration=28.441178941 podStartE2EDuration="32.915261722s" podCreationTimestamp="2026-02-27 00:17:53 +0000 UTC" firstStartedPulling="2026-02-27 00:18:20.721315406 +0000 UTC m=+769.978854960" lastFinishedPulling="2026-02-27 00:18:25.195398177 +0000 UTC m=+774.452937741" observedRunningTime="2026-02-27 00:18:25.90219526 +0000 UTC m=+775.159734824" watchObservedRunningTime="2026-02-27 00:18:25.915261722 +0000 UTC m=+775.172801286" Feb 27 00:18:25 crc kubenswrapper[4781]: I0227 00:18:25.927229 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7" podStartSLOduration=29.505241674 podStartE2EDuration="32.927202516s" podCreationTimestamp="2026-02-27 00:17:53 +0000 UTC" firstStartedPulling="2026-02-27 00:18:21.751132696 +0000 UTC m=+771.008672250" lastFinishedPulling="2026-02-27 00:18:25.173093538 +0000 UTC m=+774.430633092" observedRunningTime="2026-02-27 00:18:25.924869969 +0000 UTC m=+775.182409543" watchObservedRunningTime="2026-02-27 00:18:25.927202516 +0000 UTC m=+775.184742110" Feb 27 00:18:28 crc kubenswrapper[4781]: I0227 00:18:28.899844 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" event={"ID":"c62f5f48-b15f-4d70-837c-a05addc48839","Type":"ContainerStarted","Data":"5777fade9968bc3391c3e077c86044a6b6d19da1ff33ed02c9e469efeedf7926"} Feb 27 00:18:28 crc kubenswrapper[4781]: I0227 00:18:28.904726 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" event={"ID":"1a3a6a15-797e-4cfe-8e21-3a813460012d","Type":"ContainerStarted","Data":"593e252543ed3cf9f71943b58ff742bcafa500a84e583914feb39957ef52fa7e"} Feb 27 00:18:28 crc kubenswrapper[4781]: I0227 00:18:28.904893 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:18:28 crc kubenswrapper[4781]: I0227 00:18:28.927721 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rbdmr" podStartSLOduration=34.066664131 podStartE2EDuration="36.927692805s" podCreationTimestamp="2026-02-27 00:17:52 +0000 UTC" firstStartedPulling="2026-02-27 00:18:25.098303035 +0000 UTC m=+774.355842629" lastFinishedPulling="2026-02-27 00:18:27.959331749 +0000 UTC m=+777.216871303" observedRunningTime="2026-02-27 00:18:28.920333283 +0000 UTC m=+778.177872837" watchObservedRunningTime="2026-02-27 00:18:28.927692805 +0000 UTC m=+778.185232389" Feb 27 00:18:28 crc kubenswrapper[4781]: I0227 00:18:28.942114 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" podStartSLOduration=33.272588564 podStartE2EDuration="35.94209555s" podCreationTimestamp="2026-02-27 00:17:53 +0000 UTC" firstStartedPulling="2026-02-27 00:18:25.293312329 +0000 UTC m=+774.550851893" lastFinishedPulling="2026-02-27 00:18:27.962819325 +0000 UTC m=+777.220358879" observedRunningTime="2026-02-27 00:18:28.941935046 +0000 UTC m=+778.199474680" watchObservedRunningTime="2026-02-27 00:18:28.94209555 +0000 
UTC m=+778.199635114" Feb 27 00:18:30 crc kubenswrapper[4781]: I0227 00:18:30.931263 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" event={"ID":"3fe8e5f0-6c7b-42bd-9604-85a90477d143","Type":"ContainerStarted","Data":"3d7061a7f9a3a53dbea34bab246ba3a79a5c25b34f0e8fab1915465df308be59"} Feb 27 00:18:30 crc kubenswrapper[4781]: I0227 00:18:30.931765 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" Feb 27 00:18:30 crc kubenswrapper[4781]: I0227 00:18:30.959481 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" podStartSLOduration=33.330624535 podStartE2EDuration="37.959455349s" podCreationTimestamp="2026-02-27 00:17:53 +0000 UTC" firstStartedPulling="2026-02-27 00:18:25.771578142 +0000 UTC m=+775.029117706" lastFinishedPulling="2026-02-27 00:18:30.400408956 +0000 UTC m=+779.657948520" observedRunningTime="2026-02-27 00:18:30.956933057 +0000 UTC m=+780.214472641" watchObservedRunningTime="2026-02-27 00:18:30.959455349 +0000 UTC m=+780.216994973" Feb 27 00:18:30 crc kubenswrapper[4781]: I0227 00:18:30.982306 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-m6jxs" Feb 27 00:18:33 crc kubenswrapper[4781]: I0227 00:18:33.611225 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-l5ppf" Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.345959 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-95z7d"] Feb 27 00:18:37 crc kubenswrapper[4781]: E0227 00:18:37.346166 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bb1e1bd-28ea-42f4-96d5-534db2674e68" containerName="oc" Feb 27 00:18:37 crc 
kubenswrapper[4781]: I0227 00:18:37.346178 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bb1e1bd-28ea-42f4-96d5-534db2674e68" containerName="oc" Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.346278 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bb1e1bd-28ea-42f4-96d5-534db2674e68" containerName="oc" Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.346645 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-95z7d" Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.348539 4781 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-wgssp" Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.348601 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.351771 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.361216 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-95z7d"] Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.370350 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-mwpvm"] Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.371019 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-mwpvm" Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.372680 4781 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-bmw8p" Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.388769 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-rwwkv"] Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.389506 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-rwwkv" Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.390996 4781 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-b574w" Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.398076 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-mwpvm"] Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.426460 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-rwwkv"] Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.476590 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz4dx\" (UniqueName: \"kubernetes.io/projected/af9e6ffa-5ea0-473d-9e75-a2715093490f-kube-api-access-bz4dx\") pod \"cert-manager-cainjector-cf98fcc89-95z7d\" (UID: \"af9e6ffa-5ea0-473d-9e75-a2715093490f\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-95z7d" Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.476770 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brf6d\" (UniqueName: \"kubernetes.io/projected/749ed3fc-65b7-4674-a1b1-0433692d2d89-kube-api-access-brf6d\") pod \"cert-manager-858654f9db-mwpvm\" (UID: \"749ed3fc-65b7-4674-a1b1-0433692d2d89\") " 
pod="cert-manager/cert-manager-858654f9db-mwpvm" Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.476890 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwnzs\" (UniqueName: \"kubernetes.io/projected/b732ab89-7ea1-4378-9511-229ee7fa787f-kube-api-access-qwnzs\") pod \"cert-manager-webhook-687f57d79b-rwwkv\" (UID: \"b732ab89-7ea1-4378-9511-229ee7fa787f\") " pod="cert-manager/cert-manager-webhook-687f57d79b-rwwkv" Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.578193 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwnzs\" (UniqueName: \"kubernetes.io/projected/b732ab89-7ea1-4378-9511-229ee7fa787f-kube-api-access-qwnzs\") pod \"cert-manager-webhook-687f57d79b-rwwkv\" (UID: \"b732ab89-7ea1-4378-9511-229ee7fa787f\") " pod="cert-manager/cert-manager-webhook-687f57d79b-rwwkv" Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.578472 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz4dx\" (UniqueName: \"kubernetes.io/projected/af9e6ffa-5ea0-473d-9e75-a2715093490f-kube-api-access-bz4dx\") pod \"cert-manager-cainjector-cf98fcc89-95z7d\" (UID: \"af9e6ffa-5ea0-473d-9e75-a2715093490f\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-95z7d" Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.578523 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brf6d\" (UniqueName: \"kubernetes.io/projected/749ed3fc-65b7-4674-a1b1-0433692d2d89-kube-api-access-brf6d\") pod \"cert-manager-858654f9db-mwpvm\" (UID: \"749ed3fc-65b7-4674-a1b1-0433692d2d89\") " pod="cert-manager/cert-manager-858654f9db-mwpvm" Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.603362 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz4dx\" (UniqueName: 
\"kubernetes.io/projected/af9e6ffa-5ea0-473d-9e75-a2715093490f-kube-api-access-bz4dx\") pod \"cert-manager-cainjector-cf98fcc89-95z7d\" (UID: \"af9e6ffa-5ea0-473d-9e75-a2715093490f\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-95z7d" Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.604262 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwnzs\" (UniqueName: \"kubernetes.io/projected/b732ab89-7ea1-4378-9511-229ee7fa787f-kube-api-access-qwnzs\") pod \"cert-manager-webhook-687f57d79b-rwwkv\" (UID: \"b732ab89-7ea1-4378-9511-229ee7fa787f\") " pod="cert-manager/cert-manager-webhook-687f57d79b-rwwkv" Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.604893 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brf6d\" (UniqueName: \"kubernetes.io/projected/749ed3fc-65b7-4674-a1b1-0433692d2d89-kube-api-access-brf6d\") pod \"cert-manager-858654f9db-mwpvm\" (UID: \"749ed3fc-65b7-4674-a1b1-0433692d2d89\") " pod="cert-manager/cert-manager-858654f9db-mwpvm" Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.658879 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-95z7d" Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.724721 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-mwpvm" Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.736104 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-rwwkv" Feb 27 00:18:37 crc kubenswrapper[4781]: I0227 00:18:37.936746 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-95z7d"] Feb 27 00:18:37 crc kubenswrapper[4781]: W0227 00:18:37.986440 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf9e6ffa_5ea0_473d_9e75_a2715093490f.slice/crio-e6e7329a2c7276542a8c6bd7df5c4d934edd7576f427102979fb231ad7e7bc04 WatchSource:0}: Error finding container e6e7329a2c7276542a8c6bd7df5c4d934edd7576f427102979fb231ad7e7bc04: Status 404 returned error can't find the container with id e6e7329a2c7276542a8c6bd7df5c4d934edd7576f427102979fb231ad7e7bc04 Feb 27 00:18:38 crc kubenswrapper[4781]: I0227 00:18:38.146368 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-mwpvm"] Feb 27 00:18:38 crc kubenswrapper[4781]: I0227 00:18:38.274957 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-rwwkv"] Feb 27 00:18:38 crc kubenswrapper[4781]: I0227 00:18:38.979050 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-95z7d" event={"ID":"af9e6ffa-5ea0-473d-9e75-a2715093490f","Type":"ContainerStarted","Data":"e6e7329a2c7276542a8c6bd7df5c4d934edd7576f427102979fb231ad7e7bc04"} Feb 27 00:18:38 crc kubenswrapper[4781]: I0227 00:18:38.980085 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-mwpvm" event={"ID":"749ed3fc-65b7-4674-a1b1-0433692d2d89","Type":"ContainerStarted","Data":"15542c648efe93c26b17bc9bbb1df17d6db9b229ceedac117dc458cdd98987e1"} Feb 27 00:18:38 crc kubenswrapper[4781]: I0227 00:18:38.980729 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-rwwkv" 
event={"ID":"b732ab89-7ea1-4378-9511-229ee7fa787f","Type":"ContainerStarted","Data":"c9abdf86c1a1108c4f4b029cf3c330a6dd7c3586bd21f0a2c9e9c7bd4dc31d29"} Feb 27 00:18:44 crc kubenswrapper[4781]: I0227 00:18:44.009144 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-95z7d" event={"ID":"af9e6ffa-5ea0-473d-9e75-a2715093490f","Type":"ContainerStarted","Data":"190fb6ed70375b908ee29b76f2325b6774ab4fbf5600c1747aa43895ad26e05a"} Feb 27 00:18:44 crc kubenswrapper[4781]: I0227 00:18:44.010695 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-mwpvm" event={"ID":"749ed3fc-65b7-4674-a1b1-0433692d2d89","Type":"ContainerStarted","Data":"1fadddfe605cb826ecea402b69310323c2f9bbbd528bd35706eae7b2d853bbda"} Feb 27 00:18:44 crc kubenswrapper[4781]: I0227 00:18:44.012495 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-rwwkv" event={"ID":"b732ab89-7ea1-4378-9511-229ee7fa787f","Type":"ContainerStarted","Data":"5bc2cd1877bd267dc02009ca544c492d56d95e8421237a70276854b347449883"} Feb 27 00:18:44 crc kubenswrapper[4781]: I0227 00:18:44.012739 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-rwwkv" Feb 27 00:18:44 crc kubenswrapper[4781]: I0227 00:18:44.027111 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-95z7d" podStartSLOduration=2.244173228 podStartE2EDuration="7.027087866s" podCreationTimestamp="2026-02-27 00:18:37 +0000 UTC" firstStartedPulling="2026-02-27 00:18:37.992929083 +0000 UTC m=+787.250468637" lastFinishedPulling="2026-02-27 00:18:42.775843681 +0000 UTC m=+792.033383275" observedRunningTime="2026-02-27 00:18:44.024931963 +0000 UTC m=+793.282471517" watchObservedRunningTime="2026-02-27 00:18:44.027087866 +0000 UTC m=+793.284627420" Feb 27 00:18:44 crc kubenswrapper[4781]: I0227 
00:18:44.044422 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-rwwkv" podStartSLOduration=2.468197448 podStartE2EDuration="7.044404823s" podCreationTimestamp="2026-02-27 00:18:37 +0000 UTC" firstStartedPulling="2026-02-27 00:18:38.280350994 +0000 UTC m=+787.537890548" lastFinishedPulling="2026-02-27 00:18:42.856558349 +0000 UTC m=+792.114097923" observedRunningTime="2026-02-27 00:18:44.043367447 +0000 UTC m=+793.300907021" watchObservedRunningTime="2026-02-27 00:18:44.044404823 +0000 UTC m=+793.301944377" Feb 27 00:18:44 crc kubenswrapper[4781]: I0227 00:18:44.073401 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-mwpvm" podStartSLOduration=2.452774427 podStartE2EDuration="7.073376516s" podCreationTimestamp="2026-02-27 00:18:37 +0000 UTC" firstStartedPulling="2026-02-27 00:18:38.157084457 +0000 UTC m=+787.414624011" lastFinishedPulling="2026-02-27 00:18:42.777686546 +0000 UTC m=+792.035226100" observedRunningTime="2026-02-27 00:18:44.066287552 +0000 UTC m=+793.323827116" watchObservedRunningTime="2026-02-27 00:18:44.073376516 +0000 UTC m=+793.330916080" Feb 27 00:18:48 crc kubenswrapper[4781]: I0227 00:18:48.869504 4781 scope.go:117] "RemoveContainer" containerID="5a1ffc2079241a21de7cc919695abf3baba7e2af15f91ad7d2c4786574ddb8a4" Feb 27 00:18:52 crc kubenswrapper[4781]: I0227 00:18:52.739139 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-rwwkv" Feb 27 00:19:16 crc kubenswrapper[4781]: I0227 00:19:16.555908 4781 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 27 00:19:20 crc kubenswrapper[4781]: I0227 00:19:20.785579 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft"] Feb 27 
00:19:20 crc kubenswrapper[4781]: I0227 00:19:20.786813 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft" Feb 27 00:19:20 crc kubenswrapper[4781]: I0227 00:19:20.791601 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 27 00:19:20 crc kubenswrapper[4781]: I0227 00:19:20.800965 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft"] Feb 27 00:19:20 crc kubenswrapper[4781]: I0227 00:19:20.829801 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b41e2a48-4103-4cf3-be92-92180cbb2510-util\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft\" (UID: \"b41e2a48-4103-4cf3-be92-92180cbb2510\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft" Feb 27 00:19:20 crc kubenswrapper[4781]: I0227 00:19:20.829878 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b41e2a48-4103-4cf3-be92-92180cbb2510-bundle\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft\" (UID: \"b41e2a48-4103-4cf3-be92-92180cbb2510\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft" Feb 27 00:19:20 crc kubenswrapper[4781]: I0227 00:19:20.829934 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt87t\" (UniqueName: \"kubernetes.io/projected/b41e2a48-4103-4cf3-be92-92180cbb2510-kube-api-access-gt87t\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft\" (UID: \"b41e2a48-4103-4cf3-be92-92180cbb2510\") " 
pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft" Feb 27 00:19:20 crc kubenswrapper[4781]: I0227 00:19:20.931299 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b41e2a48-4103-4cf3-be92-92180cbb2510-util\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft\" (UID: \"b41e2a48-4103-4cf3-be92-92180cbb2510\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft" Feb 27 00:19:20 crc kubenswrapper[4781]: I0227 00:19:20.931412 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b41e2a48-4103-4cf3-be92-92180cbb2510-bundle\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft\" (UID: \"b41e2a48-4103-4cf3-be92-92180cbb2510\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft" Feb 27 00:19:20 crc kubenswrapper[4781]: I0227 00:19:20.931510 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt87t\" (UniqueName: \"kubernetes.io/projected/b41e2a48-4103-4cf3-be92-92180cbb2510-kube-api-access-gt87t\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft\" (UID: \"b41e2a48-4103-4cf3-be92-92180cbb2510\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft" Feb 27 00:19:20 crc kubenswrapper[4781]: I0227 00:19:20.931956 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b41e2a48-4103-4cf3-be92-92180cbb2510-bundle\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft\" (UID: \"b41e2a48-4103-4cf3-be92-92180cbb2510\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft" Feb 27 00:19:20 crc kubenswrapper[4781]: I0227 00:19:20.932208 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b41e2a48-4103-4cf3-be92-92180cbb2510-util\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft\" (UID: \"b41e2a48-4103-4cf3-be92-92180cbb2510\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft" Feb 27 00:19:20 crc kubenswrapper[4781]: I0227 00:19:20.951828 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt87t\" (UniqueName: \"kubernetes.io/projected/b41e2a48-4103-4cf3-be92-92180cbb2510-kube-api-access-gt87t\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft\" (UID: \"b41e2a48-4103-4cf3-be92-92180cbb2510\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft" Feb 27 00:19:21 crc kubenswrapper[4781]: I0227 00:19:21.101657 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft" Feb 27 00:19:21 crc kubenswrapper[4781]: I0227 00:19:21.609874 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft"] Feb 27 00:19:22 crc kubenswrapper[4781]: I0227 00:19:22.301562 4781 generic.go:334] "Generic (PLEG): container finished" podID="b41e2a48-4103-4cf3-be92-92180cbb2510" containerID="bc7287992fd979e253afb46a18f7f269a613b396ec959609a258dd64fecc5b22" exitCode=0 Feb 27 00:19:22 crc kubenswrapper[4781]: I0227 00:19:22.301657 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft" event={"ID":"b41e2a48-4103-4cf3-be92-92180cbb2510","Type":"ContainerDied","Data":"bc7287992fd979e253afb46a18f7f269a613b396ec959609a258dd64fecc5b22"} Feb 27 00:19:22 crc kubenswrapper[4781]: I0227 00:19:22.301728 4781 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft" event={"ID":"b41e2a48-4103-4cf3-be92-92180cbb2510","Type":"ContainerStarted","Data":"7d2eca92294c9202715d81184c0e75a00c33a2fd2d34bb6cf07794bd3af6de5d"} Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.161702 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cxq4v"] Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.165962 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cxq4v" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.205344 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.206456 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.212371 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.212899 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.218711 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cxq4v"] Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.229524 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.289773 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6-utilities\") pod \"redhat-operators-cxq4v\" (UID: \"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6\") " pod="openshift-marketplace/redhat-operators-cxq4v" Feb 27 00:19:23 crc 
kubenswrapper[4781]: I0227 00:19:23.289814 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6-catalog-content\") pod \"redhat-operators-cxq4v\" (UID: \"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6\") " pod="openshift-marketplace/redhat-operators-cxq4v" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.289887 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2742\" (UniqueName: \"kubernetes.io/projected/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6-kube-api-access-v2742\") pod \"redhat-operators-cxq4v\" (UID: \"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6\") " pod="openshift-marketplace/redhat-operators-cxq4v" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.390934 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6-utilities\") pod \"redhat-operators-cxq4v\" (UID: \"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6\") " pod="openshift-marketplace/redhat-operators-cxq4v" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.390990 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d7f58c6f-f8ba-472f-9b8e-22ada42e91f0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7f58c6f-f8ba-472f-9b8e-22ada42e91f0\") pod \"minio\" (UID: \"a0aec676-41f4-4855-a823-2a3b21cbe197\") " pod="minio-dev/minio" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.391011 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6-catalog-content\") pod \"redhat-operators-cxq4v\" (UID: \"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6\") " pod="openshift-marketplace/redhat-operators-cxq4v" Feb 27 
00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.391057 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2742\" (UniqueName: \"kubernetes.io/projected/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6-kube-api-access-v2742\") pod \"redhat-operators-cxq4v\" (UID: \"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6\") " pod="openshift-marketplace/redhat-operators-cxq4v" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.391101 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtv88\" (UniqueName: \"kubernetes.io/projected/a0aec676-41f4-4855-a823-2a3b21cbe197-kube-api-access-wtv88\") pod \"minio\" (UID: \"a0aec676-41f4-4855-a823-2a3b21cbe197\") " pod="minio-dev/minio" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.391757 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6-utilities\") pod \"redhat-operators-cxq4v\" (UID: \"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6\") " pod="openshift-marketplace/redhat-operators-cxq4v" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.391763 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6-catalog-content\") pod \"redhat-operators-cxq4v\" (UID: \"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6\") " pod="openshift-marketplace/redhat-operators-cxq4v" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.419204 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2742\" (UniqueName: \"kubernetes.io/projected/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6-kube-api-access-v2742\") pod \"redhat-operators-cxq4v\" (UID: \"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6\") " pod="openshift-marketplace/redhat-operators-cxq4v" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.492337 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtv88\" (UniqueName: \"kubernetes.io/projected/a0aec676-41f4-4855-a823-2a3b21cbe197-kube-api-access-wtv88\") pod \"minio\" (UID: \"a0aec676-41f4-4855-a823-2a3b21cbe197\") " pod="minio-dev/minio" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.492480 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d7f58c6f-f8ba-472f-9b8e-22ada42e91f0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7f58c6f-f8ba-472f-9b8e-22ada42e91f0\") pod \"minio\" (UID: \"a0aec676-41f4-4855-a823-2a3b21cbe197\") " pod="minio-dev/minio" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.498368 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.498405 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d7f58c6f-f8ba-472f-9b8e-22ada42e91f0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7f58c6f-f8ba-472f-9b8e-22ada42e91f0\") pod \"minio\" (UID: \"a0aec676-41f4-4855-a823-2a3b21cbe197\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/68e97b9e042bbbb942f612a5d441bf2a1f903b9844cb564189e4809c6edaf34d/globalmount\"" pod="minio-dev/minio" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.511346 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtv88\" (UniqueName: \"kubernetes.io/projected/a0aec676-41f4-4855-a823-2a3b21cbe197-kube-api-access-wtv88\") pod \"minio\" (UID: \"a0aec676-41f4-4855-a823-2a3b21cbe197\") " pod="minio-dev/minio" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.518999 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d7f58c6f-f8ba-472f-9b8e-22ada42e91f0\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7f58c6f-f8ba-472f-9b8e-22ada42e91f0\") pod \"minio\" (UID: \"a0aec676-41f4-4855-a823-2a3b21cbe197\") " pod="minio-dev/minio" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.529973 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cxq4v" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.536379 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Feb 27 00:19:23 crc kubenswrapper[4781]: I0227 00:19:23.824004 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Feb 27 00:19:23 crc kubenswrapper[4781]: W0227 00:19:23.827731 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0aec676_41f4_4855_a823_2a3b21cbe197.slice/crio-a1050fb5b012640328c623fbff0bbdc041549c7d210a82dfd0c46ed2ad129dc6 WatchSource:0}: Error finding container a1050fb5b012640328c623fbff0bbdc041549c7d210a82dfd0c46ed2ad129dc6: Status 404 returned error can't find the container with id a1050fb5b012640328c623fbff0bbdc041549c7d210a82dfd0c46ed2ad129dc6 Feb 27 00:19:24 crc kubenswrapper[4781]: I0227 00:19:24.041153 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cxq4v"] Feb 27 00:19:24 crc kubenswrapper[4781]: I0227 00:19:24.334291 4781 generic.go:334] "Generic (PLEG): container finished" podID="b41e2a48-4103-4cf3-be92-92180cbb2510" containerID="7aaedda8a4b57bd598af91df724a3a264b47d5539f95c9ab714487d376f1cf73" exitCode=0 Feb 27 00:19:24 crc kubenswrapper[4781]: I0227 00:19:24.334360 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft" 
event={"ID":"b41e2a48-4103-4cf3-be92-92180cbb2510","Type":"ContainerDied","Data":"7aaedda8a4b57bd598af91df724a3a264b47d5539f95c9ab714487d376f1cf73"} Feb 27 00:19:24 crc kubenswrapper[4781]: I0227 00:19:24.341165 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"a0aec676-41f4-4855-a823-2a3b21cbe197","Type":"ContainerStarted","Data":"a1050fb5b012640328c623fbff0bbdc041549c7d210a82dfd0c46ed2ad129dc6"} Feb 27 00:19:24 crc kubenswrapper[4781]: I0227 00:19:24.343421 4781 generic.go:334] "Generic (PLEG): container finished" podID="9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6" containerID="c2ec72ae5d91fca7c3848ad6631e6e984303ab9557e9cc6207b8fe35ff35e59a" exitCode=0 Feb 27 00:19:24 crc kubenswrapper[4781]: I0227 00:19:24.343446 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxq4v" event={"ID":"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6","Type":"ContainerDied","Data":"c2ec72ae5d91fca7c3848ad6631e6e984303ab9557e9cc6207b8fe35ff35e59a"} Feb 27 00:19:24 crc kubenswrapper[4781]: I0227 00:19:24.343466 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxq4v" event={"ID":"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6","Type":"ContainerStarted","Data":"b3755082d116f31051d75af049033b929644572e35832ae06d2d8defd5e385bf"} Feb 27 00:19:25 crc kubenswrapper[4781]: I0227 00:19:25.352412 4781 generic.go:334] "Generic (PLEG): container finished" podID="b41e2a48-4103-4cf3-be92-92180cbb2510" containerID="9dea4a634c964f942cc980c671d7554dfbcca7841dbc0cac25ce15174d9577d9" exitCode=0 Feb 27 00:19:25 crc kubenswrapper[4781]: I0227 00:19:25.352473 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft" event={"ID":"b41e2a48-4103-4cf3-be92-92180cbb2510","Type":"ContainerDied","Data":"9dea4a634c964f942cc980c671d7554dfbcca7841dbc0cac25ce15174d9577d9"} Feb 27 00:19:26 crc kubenswrapper[4781]: I0227 
00:19:26.652366 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft" Feb 27 00:19:26 crc kubenswrapper[4781]: I0227 00:19:26.741508 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gt87t\" (UniqueName: \"kubernetes.io/projected/b41e2a48-4103-4cf3-be92-92180cbb2510-kube-api-access-gt87t\") pod \"b41e2a48-4103-4cf3-be92-92180cbb2510\" (UID: \"b41e2a48-4103-4cf3-be92-92180cbb2510\") " Feb 27 00:19:26 crc kubenswrapper[4781]: I0227 00:19:26.741922 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b41e2a48-4103-4cf3-be92-92180cbb2510-util\") pod \"b41e2a48-4103-4cf3-be92-92180cbb2510\" (UID: \"b41e2a48-4103-4cf3-be92-92180cbb2510\") " Feb 27 00:19:26 crc kubenswrapper[4781]: I0227 00:19:26.742113 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b41e2a48-4103-4cf3-be92-92180cbb2510-bundle\") pod \"b41e2a48-4103-4cf3-be92-92180cbb2510\" (UID: \"b41e2a48-4103-4cf3-be92-92180cbb2510\") " Feb 27 00:19:26 crc kubenswrapper[4781]: I0227 00:19:26.743777 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b41e2a48-4103-4cf3-be92-92180cbb2510-bundle" (OuterVolumeSpecName: "bundle") pod "b41e2a48-4103-4cf3-be92-92180cbb2510" (UID: "b41e2a48-4103-4cf3-be92-92180cbb2510"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:19:26 crc kubenswrapper[4781]: I0227 00:19:26.747781 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b41e2a48-4103-4cf3-be92-92180cbb2510-kube-api-access-gt87t" (OuterVolumeSpecName: "kube-api-access-gt87t") pod "b41e2a48-4103-4cf3-be92-92180cbb2510" (UID: "b41e2a48-4103-4cf3-be92-92180cbb2510"). InnerVolumeSpecName "kube-api-access-gt87t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:19:26 crc kubenswrapper[4781]: I0227 00:19:26.756422 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b41e2a48-4103-4cf3-be92-92180cbb2510-util" (OuterVolumeSpecName: "util") pod "b41e2a48-4103-4cf3-be92-92180cbb2510" (UID: "b41e2a48-4103-4cf3-be92-92180cbb2510"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:19:26 crc kubenswrapper[4781]: I0227 00:19:26.843397 4781 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b41e2a48-4103-4cf3-be92-92180cbb2510-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:19:26 crc kubenswrapper[4781]: I0227 00:19:26.843432 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gt87t\" (UniqueName: \"kubernetes.io/projected/b41e2a48-4103-4cf3-be92-92180cbb2510-kube-api-access-gt87t\") on node \"crc\" DevicePath \"\"" Feb 27 00:19:26 crc kubenswrapper[4781]: I0227 00:19:26.843444 4781 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b41e2a48-4103-4cf3-be92-92180cbb2510-util\") on node \"crc\" DevicePath \"\"" Feb 27 00:19:27 crc kubenswrapper[4781]: I0227 00:19:27.364948 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" 
event={"ID":"a0aec676-41f4-4855-a823-2a3b21cbe197","Type":"ContainerStarted","Data":"21bbec858d72f4a89b6f19914c34e02ea4d0991ee8b28d1b141c8826b9f5bd89"} Feb 27 00:19:27 crc kubenswrapper[4781]: I0227 00:19:27.367216 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxq4v" event={"ID":"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6","Type":"ContainerStarted","Data":"fd49d241062e4b0f19f4f873a028e0428f598ee6154fea1566f06dac0677af28"} Feb 27 00:19:27 crc kubenswrapper[4781]: I0227 00:19:27.369252 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft" event={"ID":"b41e2a48-4103-4cf3-be92-92180cbb2510","Type":"ContainerDied","Data":"7d2eca92294c9202715d81184c0e75a00c33a2fd2d34bb6cf07794bd3af6de5d"} Feb 27 00:19:27 crc kubenswrapper[4781]: I0227 00:19:27.369280 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d2eca92294c9202715d81184c0e75a00c33a2fd2d34bb6cf07794bd3af6de5d" Feb 27 00:19:27 crc kubenswrapper[4781]: I0227 00:19:27.369320 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft" Feb 27 00:19:27 crc kubenswrapper[4781]: I0227 00:19:27.381917 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=4.277095506 podStartE2EDuration="7.381896885s" podCreationTimestamp="2026-02-27 00:19:20 +0000 UTC" firstStartedPulling="2026-02-27 00:19:23.829871438 +0000 UTC m=+833.087410992" lastFinishedPulling="2026-02-27 00:19:26.934672817 +0000 UTC m=+836.192212371" observedRunningTime="2026-02-27 00:19:27.377127127 +0000 UTC m=+836.634666691" watchObservedRunningTime="2026-02-27 00:19:27.381896885 +0000 UTC m=+836.639436449" Feb 27 00:19:28 crc kubenswrapper[4781]: I0227 00:19:28.379749 4781 generic.go:334] "Generic (PLEG): container finished" podID="9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6" containerID="fd49d241062e4b0f19f4f873a028e0428f598ee6154fea1566f06dac0677af28" exitCode=0 Feb 27 00:19:28 crc kubenswrapper[4781]: I0227 00:19:28.379819 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxq4v" event={"ID":"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6","Type":"ContainerDied","Data":"fd49d241062e4b0f19f4f873a028e0428f598ee6154fea1566f06dac0677af28"} Feb 27 00:19:29 crc kubenswrapper[4781]: I0227 00:19:29.387135 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxq4v" event={"ID":"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6","Type":"ContainerStarted","Data":"3a8d72984dd0ec9c88fe8f70ea8aeb7fd6f907a54e516ca4b6481b0e08d04eae"} Feb 27 00:19:29 crc kubenswrapper[4781]: I0227 00:19:29.414645 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cxq4v" podStartSLOduration=2.007851489 podStartE2EDuration="6.414603141s" podCreationTimestamp="2026-02-27 00:19:23 +0000 UTC" firstStartedPulling="2026-02-27 00:19:24.349571602 +0000 UTC m=+833.607111156" 
lastFinishedPulling="2026-02-27 00:19:28.756323254 +0000 UTC m=+838.013862808" observedRunningTime="2026-02-27 00:19:29.412448728 +0000 UTC m=+838.669988272" watchObservedRunningTime="2026-02-27 00:19:29.414603141 +0000 UTC m=+838.672142695" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.531737 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cxq4v" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.532278 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cxq4v" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.579072 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj"] Feb 27 00:19:33 crc kubenswrapper[4781]: E0227 00:19:33.579290 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b41e2a48-4103-4cf3-be92-92180cbb2510" containerName="pull" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.579306 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b41e2a48-4103-4cf3-be92-92180cbb2510" containerName="pull" Feb 27 00:19:33 crc kubenswrapper[4781]: E0227 00:19:33.579319 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b41e2a48-4103-4cf3-be92-92180cbb2510" containerName="extract" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.579325 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b41e2a48-4103-4cf3-be92-92180cbb2510" containerName="extract" Feb 27 00:19:33 crc kubenswrapper[4781]: E0227 00:19:33.579333 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b41e2a48-4103-4cf3-be92-92180cbb2510" containerName="util" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.579340 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b41e2a48-4103-4cf3-be92-92180cbb2510" containerName="util" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 
00:19:33.579449 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="b41e2a48-4103-4cf3-be92-92180cbb2510" containerName="extract" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.579988 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.582592 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.582695 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.583271 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-7krlt" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.583284 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.583549 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.587424 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.598061 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj"] Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.739178 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcjsq\" (UniqueName: 
\"kubernetes.io/projected/1fed4c33-9f3f-486b-8f74-f2d9a09b92be-kube-api-access-pcjsq\") pod \"loki-operator-controller-manager-6c4cf64b95-qzbxj\" (UID: \"1fed4c33-9f3f-486b-8f74-f2d9a09b92be\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.739254 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1fed4c33-9f3f-486b-8f74-f2d9a09b92be-webhook-cert\") pod \"loki-operator-controller-manager-6c4cf64b95-qzbxj\" (UID: \"1fed4c33-9f3f-486b-8f74-f2d9a09b92be\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.739302 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1fed4c33-9f3f-486b-8f74-f2d9a09b92be-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-6c4cf64b95-qzbxj\" (UID: \"1fed4c33-9f3f-486b-8f74-f2d9a09b92be\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.739346 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/1fed4c33-9f3f-486b-8f74-f2d9a09b92be-manager-config\") pod \"loki-operator-controller-manager-6c4cf64b95-qzbxj\" (UID: \"1fed4c33-9f3f-486b-8f74-f2d9a09b92be\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.739467 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1fed4c33-9f3f-486b-8f74-f2d9a09b92be-apiservice-cert\") pod 
\"loki-operator-controller-manager-6c4cf64b95-qzbxj\" (UID: \"1fed4c33-9f3f-486b-8f74-f2d9a09b92be\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.840708 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1fed4c33-9f3f-486b-8f74-f2d9a09b92be-apiservice-cert\") pod \"loki-operator-controller-manager-6c4cf64b95-qzbxj\" (UID: \"1fed4c33-9f3f-486b-8f74-f2d9a09b92be\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.840830 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcjsq\" (UniqueName: \"kubernetes.io/projected/1fed4c33-9f3f-486b-8f74-f2d9a09b92be-kube-api-access-pcjsq\") pod \"loki-operator-controller-manager-6c4cf64b95-qzbxj\" (UID: \"1fed4c33-9f3f-486b-8f74-f2d9a09b92be\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.840864 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1fed4c33-9f3f-486b-8f74-f2d9a09b92be-webhook-cert\") pod \"loki-operator-controller-manager-6c4cf64b95-qzbxj\" (UID: \"1fed4c33-9f3f-486b-8f74-f2d9a09b92be\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.840893 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1fed4c33-9f3f-486b-8f74-f2d9a09b92be-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-6c4cf64b95-qzbxj\" (UID: \"1fed4c33-9f3f-486b-8f74-f2d9a09b92be\") " 
pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.840940 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/1fed4c33-9f3f-486b-8f74-f2d9a09b92be-manager-config\") pod \"loki-operator-controller-manager-6c4cf64b95-qzbxj\" (UID: \"1fed4c33-9f3f-486b-8f74-f2d9a09b92be\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.842381 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/1fed4c33-9f3f-486b-8f74-f2d9a09b92be-manager-config\") pod \"loki-operator-controller-manager-6c4cf64b95-qzbxj\" (UID: \"1fed4c33-9f3f-486b-8f74-f2d9a09b92be\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.849850 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1fed4c33-9f3f-486b-8f74-f2d9a09b92be-apiservice-cert\") pod \"loki-operator-controller-manager-6c4cf64b95-qzbxj\" (UID: \"1fed4c33-9f3f-486b-8f74-f2d9a09b92be\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.850870 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1fed4c33-9f3f-486b-8f74-f2d9a09b92be-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-6c4cf64b95-qzbxj\" (UID: \"1fed4c33-9f3f-486b-8f74-f2d9a09b92be\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.861342 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-pcjsq\" (UniqueName: \"kubernetes.io/projected/1fed4c33-9f3f-486b-8f74-f2d9a09b92be-kube-api-access-pcjsq\") pod \"loki-operator-controller-manager-6c4cf64b95-qzbxj\" (UID: \"1fed4c33-9f3f-486b-8f74-f2d9a09b92be\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.863747 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1fed4c33-9f3f-486b-8f74-f2d9a09b92be-webhook-cert\") pod \"loki-operator-controller-manager-6c4cf64b95-qzbxj\" (UID: \"1fed4c33-9f3f-486b-8f74-f2d9a09b92be\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:33 crc kubenswrapper[4781]: I0227 00:19:33.892534 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:34 crc kubenswrapper[4781]: I0227 00:19:34.136852 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj"] Feb 27 00:19:34 crc kubenswrapper[4781]: W0227 00:19:34.138763 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fed4c33_9f3f_486b_8f74_f2d9a09b92be.slice/crio-91e075b25116dfc1152f7bbf22538b3b541d447572a496588f2a3174019e9772 WatchSource:0}: Error finding container 91e075b25116dfc1152f7bbf22538b3b541d447572a496588f2a3174019e9772: Status 404 returned error can't find the container with id 91e075b25116dfc1152f7bbf22538b3b541d447572a496588f2a3174019e9772 Feb 27 00:19:34 crc kubenswrapper[4781]: I0227 00:19:34.412689 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" 
event={"ID":"1fed4c33-9f3f-486b-8f74-f2d9a09b92be","Type":"ContainerStarted","Data":"91e075b25116dfc1152f7bbf22538b3b541d447572a496588f2a3174019e9772"} Feb 27 00:19:34 crc kubenswrapper[4781]: I0227 00:19:34.614257 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cxq4v" podUID="9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6" containerName="registry-server" probeResult="failure" output=< Feb 27 00:19:34 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s Feb 27 00:19:34 crc kubenswrapper[4781]: > Feb 27 00:19:39 crc kubenswrapper[4781]: I0227 00:19:39.440867 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" event={"ID":"1fed4c33-9f3f-486b-8f74-f2d9a09b92be","Type":"ContainerStarted","Data":"24689dd834be85cbfe91a39df3ce366aa3d20f4672567cd3f86ec05888c16cd8"} Feb 27 00:19:42 crc kubenswrapper[4781]: I0227 00:19:42.895903 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:19:42 crc kubenswrapper[4781]: I0227 00:19:42.896267 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:19:43 crc kubenswrapper[4781]: I0227 00:19:43.575084 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cxq4v" Feb 27 00:19:43 crc kubenswrapper[4781]: I0227 00:19:43.620637 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-cxq4v" Feb 27 00:19:44 crc kubenswrapper[4781]: I0227 00:19:44.760562 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cxq4v"] Feb 27 00:19:45 crc kubenswrapper[4781]: I0227 00:19:45.479977 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cxq4v" podUID="9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6" containerName="registry-server" containerID="cri-o://3a8d72984dd0ec9c88fe8f70ea8aeb7fd6f907a54e516ca4b6481b0e08d04eae" gracePeriod=2 Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.077714 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cxq4v" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.147059 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6-utilities\") pod \"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6\" (UID: \"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6\") " Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.147123 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2742\" (UniqueName: \"kubernetes.io/projected/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6-kube-api-access-v2742\") pod \"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6\" (UID: \"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6\") " Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.147149 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6-catalog-content\") pod \"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6\" (UID: \"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6\") " Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.148877 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6-utilities" (OuterVolumeSpecName: "utilities") pod "9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6" (UID: "9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.153046 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6-kube-api-access-v2742" (OuterVolumeSpecName: "kube-api-access-v2742") pod "9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6" (UID: "9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6"). InnerVolumeSpecName "kube-api-access-v2742". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.248958 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.249015 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2742\" (UniqueName: \"kubernetes.io/projected/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6-kube-api-access-v2742\") on node \"crc\" DevicePath \"\"" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.251143 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6" (UID: "9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.350374 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.489008 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" event={"ID":"1fed4c33-9f3f-486b-8f74-f2d9a09b92be","Type":"ContainerStarted","Data":"3def8960b32d0635eb7789a6c586b8b43152158956b19377792b52cb099ee428"} Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.489434 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.491528 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.491866 4781 generic.go:334] "Generic (PLEG): container finished" podID="9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6" containerID="3a8d72984dd0ec9c88fe8f70ea8aeb7fd6f907a54e516ca4b6481b0e08d04eae" exitCode=0 Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.491936 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxq4v" event={"ID":"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6","Type":"ContainerDied","Data":"3a8d72984dd0ec9c88fe8f70ea8aeb7fd6f907a54e516ca4b6481b0e08d04eae"} Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.491980 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxq4v" 
event={"ID":"9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6","Type":"ContainerDied","Data":"b3755082d116f31051d75af049033b929644572e35832ae06d2d8defd5e385bf"} Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.492011 4781 scope.go:117] "RemoveContainer" containerID="3a8d72984dd0ec9c88fe8f70ea8aeb7fd6f907a54e516ca4b6481b0e08d04eae" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.491940 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cxq4v" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.522360 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-6c4cf64b95-qzbxj" podStartSLOduration=1.749427658 podStartE2EDuration="13.522341528s" podCreationTimestamp="2026-02-27 00:19:33 +0000 UTC" firstStartedPulling="2026-02-27 00:19:34.142643429 +0000 UTC m=+843.400182993" lastFinishedPulling="2026-02-27 00:19:45.915557309 +0000 UTC m=+855.173096863" observedRunningTime="2026-02-27 00:19:46.515037262 +0000 UTC m=+855.772576856" watchObservedRunningTime="2026-02-27 00:19:46.522341528 +0000 UTC m=+855.779881092" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.527491 4781 scope.go:117] "RemoveContainer" containerID="fd49d241062e4b0f19f4f873a028e0428f598ee6154fea1566f06dac0677af28" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.541596 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cxq4v"] Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.549086 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cxq4v"] Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.563929 4781 scope.go:117] "RemoveContainer" containerID="c2ec72ae5d91fca7c3848ad6631e6e984303ab9557e9cc6207b8fe35ff35e59a" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.585023 4781 scope.go:117] "RemoveContainer" 
containerID="3a8d72984dd0ec9c88fe8f70ea8aeb7fd6f907a54e516ca4b6481b0e08d04eae" Feb 27 00:19:46 crc kubenswrapper[4781]: E0227 00:19:46.586311 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a8d72984dd0ec9c88fe8f70ea8aeb7fd6f907a54e516ca4b6481b0e08d04eae\": container with ID starting with 3a8d72984dd0ec9c88fe8f70ea8aeb7fd6f907a54e516ca4b6481b0e08d04eae not found: ID does not exist" containerID="3a8d72984dd0ec9c88fe8f70ea8aeb7fd6f907a54e516ca4b6481b0e08d04eae" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.586355 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a8d72984dd0ec9c88fe8f70ea8aeb7fd6f907a54e516ca4b6481b0e08d04eae"} err="failed to get container status \"3a8d72984dd0ec9c88fe8f70ea8aeb7fd6f907a54e516ca4b6481b0e08d04eae\": rpc error: code = NotFound desc = could not find container \"3a8d72984dd0ec9c88fe8f70ea8aeb7fd6f907a54e516ca4b6481b0e08d04eae\": container with ID starting with 3a8d72984dd0ec9c88fe8f70ea8aeb7fd6f907a54e516ca4b6481b0e08d04eae not found: ID does not exist" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.586383 4781 scope.go:117] "RemoveContainer" containerID="fd49d241062e4b0f19f4f873a028e0428f598ee6154fea1566f06dac0677af28" Feb 27 00:19:46 crc kubenswrapper[4781]: E0227 00:19:46.586934 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd49d241062e4b0f19f4f873a028e0428f598ee6154fea1566f06dac0677af28\": container with ID starting with fd49d241062e4b0f19f4f873a028e0428f598ee6154fea1566f06dac0677af28 not found: ID does not exist" containerID="fd49d241062e4b0f19f4f873a028e0428f598ee6154fea1566f06dac0677af28" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.586993 4781 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fd49d241062e4b0f19f4f873a028e0428f598ee6154fea1566f06dac0677af28"} err="failed to get container status \"fd49d241062e4b0f19f4f873a028e0428f598ee6154fea1566f06dac0677af28\": rpc error: code = NotFound desc = could not find container \"fd49d241062e4b0f19f4f873a028e0428f598ee6154fea1566f06dac0677af28\": container with ID starting with fd49d241062e4b0f19f4f873a028e0428f598ee6154fea1566f06dac0677af28 not found: ID does not exist" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.587029 4781 scope.go:117] "RemoveContainer" containerID="c2ec72ae5d91fca7c3848ad6631e6e984303ab9557e9cc6207b8fe35ff35e59a" Feb 27 00:19:46 crc kubenswrapper[4781]: E0227 00:19:46.587479 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2ec72ae5d91fca7c3848ad6631e6e984303ab9557e9cc6207b8fe35ff35e59a\": container with ID starting with c2ec72ae5d91fca7c3848ad6631e6e984303ab9557e9cc6207b8fe35ff35e59a not found: ID does not exist" containerID="c2ec72ae5d91fca7c3848ad6631e6e984303ab9557e9cc6207b8fe35ff35e59a" Feb 27 00:19:46 crc kubenswrapper[4781]: I0227 00:19:46.587525 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2ec72ae5d91fca7c3848ad6631e6e984303ab9557e9cc6207b8fe35ff35e59a"} err="failed to get container status \"c2ec72ae5d91fca7c3848ad6631e6e984303ab9557e9cc6207b8fe35ff35e59a\": rpc error: code = NotFound desc = could not find container \"c2ec72ae5d91fca7c3848ad6631e6e984303ab9557e9cc6207b8fe35ff35e59a\": container with ID starting with c2ec72ae5d91fca7c3848ad6631e6e984303ab9557e9cc6207b8fe35ff35e59a not found: ID does not exist" Feb 27 00:19:47 crc kubenswrapper[4781]: I0227 00:19:47.317710 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6" path="/var/lib/kubelet/pods/9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6/volumes" Feb 27 00:20:00 crc kubenswrapper[4781]: I0227 
00:20:00.139929 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535860-d6xsb"] Feb 27 00:20:00 crc kubenswrapper[4781]: E0227 00:20:00.140617 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6" containerName="registry-server" Feb 27 00:20:00 crc kubenswrapper[4781]: I0227 00:20:00.140642 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6" containerName="registry-server" Feb 27 00:20:00 crc kubenswrapper[4781]: E0227 00:20:00.140653 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6" containerName="extract-utilities" Feb 27 00:20:00 crc kubenswrapper[4781]: I0227 00:20:00.140659 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6" containerName="extract-utilities" Feb 27 00:20:00 crc kubenswrapper[4781]: E0227 00:20:00.140671 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6" containerName="extract-content" Feb 27 00:20:00 crc kubenswrapper[4781]: I0227 00:20:00.140677 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6" containerName="extract-content" Feb 27 00:20:00 crc kubenswrapper[4781]: I0227 00:20:00.140781 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d4b8c95-a95b-49b0-be7f-5a9aa2e6caa6" containerName="registry-server" Feb 27 00:20:00 crc kubenswrapper[4781]: I0227 00:20:00.141130 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535860-d6xsb" Feb 27 00:20:00 crc kubenswrapper[4781]: I0227 00:20:00.143669 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:20:00 crc kubenswrapper[4781]: I0227 00:20:00.145051 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:20:00 crc kubenswrapper[4781]: I0227 00:20:00.145833 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:20:00 crc kubenswrapper[4781]: I0227 00:20:00.157792 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535860-d6xsb"] Feb 27 00:20:00 crc kubenswrapper[4781]: I0227 00:20:00.225883 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76h28\" (UniqueName: \"kubernetes.io/projected/c8ba504f-040f-4632-b5d0-4b28aef8d27e-kube-api-access-76h28\") pod \"auto-csr-approver-29535860-d6xsb\" (UID: \"c8ba504f-040f-4632-b5d0-4b28aef8d27e\") " pod="openshift-infra/auto-csr-approver-29535860-d6xsb" Feb 27 00:20:00 crc kubenswrapper[4781]: I0227 00:20:00.327007 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76h28\" (UniqueName: \"kubernetes.io/projected/c8ba504f-040f-4632-b5d0-4b28aef8d27e-kube-api-access-76h28\") pod \"auto-csr-approver-29535860-d6xsb\" (UID: \"c8ba504f-040f-4632-b5d0-4b28aef8d27e\") " pod="openshift-infra/auto-csr-approver-29535860-d6xsb" Feb 27 00:20:00 crc kubenswrapper[4781]: I0227 00:20:00.350237 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76h28\" (UniqueName: \"kubernetes.io/projected/c8ba504f-040f-4632-b5d0-4b28aef8d27e-kube-api-access-76h28\") pod \"auto-csr-approver-29535860-d6xsb\" (UID: \"c8ba504f-040f-4632-b5d0-4b28aef8d27e\") " 
pod="openshift-infra/auto-csr-approver-29535860-d6xsb" Feb 27 00:20:00 crc kubenswrapper[4781]: I0227 00:20:00.458397 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535860-d6xsb" Feb 27 00:20:00 crc kubenswrapper[4781]: I0227 00:20:00.912212 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535860-d6xsb"] Feb 27 00:20:01 crc kubenswrapper[4781]: I0227 00:20:01.598034 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535860-d6xsb" event={"ID":"c8ba504f-040f-4632-b5d0-4b28aef8d27e","Type":"ContainerStarted","Data":"ebbaa7e5cfe4109f4715244e441a89cf7a8b9e24ee4b9e5fe796cdd5c1b58c0b"} Feb 27 00:20:02 crc kubenswrapper[4781]: I0227 00:20:02.606461 4781 generic.go:334] "Generic (PLEG): container finished" podID="c8ba504f-040f-4632-b5d0-4b28aef8d27e" containerID="3eb6fa2c40c5ff8bd90c7472dc3a2b552bb7c38236a559c08d25c903e216a06b" exitCode=0 Feb 27 00:20:02 crc kubenswrapper[4781]: I0227 00:20:02.606579 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535860-d6xsb" event={"ID":"c8ba504f-040f-4632-b5d0-4b28aef8d27e","Type":"ContainerDied","Data":"3eb6fa2c40c5ff8bd90c7472dc3a2b552bb7c38236a559c08d25c903e216a06b"} Feb 27 00:20:03 crc kubenswrapper[4781]: I0227 00:20:03.913414 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535860-d6xsb" Feb 27 00:20:03 crc kubenswrapper[4781]: I0227 00:20:03.972980 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76h28\" (UniqueName: \"kubernetes.io/projected/c8ba504f-040f-4632-b5d0-4b28aef8d27e-kube-api-access-76h28\") pod \"c8ba504f-040f-4632-b5d0-4b28aef8d27e\" (UID: \"c8ba504f-040f-4632-b5d0-4b28aef8d27e\") " Feb 27 00:20:03 crc kubenswrapper[4781]: I0227 00:20:03.980265 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8ba504f-040f-4632-b5d0-4b28aef8d27e-kube-api-access-76h28" (OuterVolumeSpecName: "kube-api-access-76h28") pod "c8ba504f-040f-4632-b5d0-4b28aef8d27e" (UID: "c8ba504f-040f-4632-b5d0-4b28aef8d27e"). InnerVolumeSpecName "kube-api-access-76h28". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:20:04 crc kubenswrapper[4781]: I0227 00:20:04.075105 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76h28\" (UniqueName: \"kubernetes.io/projected/c8ba504f-040f-4632-b5d0-4b28aef8d27e-kube-api-access-76h28\") on node \"crc\" DevicePath \"\"" Feb 27 00:20:04 crc kubenswrapper[4781]: I0227 00:20:04.622557 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535860-d6xsb" event={"ID":"c8ba504f-040f-4632-b5d0-4b28aef8d27e","Type":"ContainerDied","Data":"ebbaa7e5cfe4109f4715244e441a89cf7a8b9e24ee4b9e5fe796cdd5c1b58c0b"} Feb 27 00:20:04 crc kubenswrapper[4781]: I0227 00:20:04.622621 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebbaa7e5cfe4109f4715244e441a89cf7a8b9e24ee4b9e5fe796cdd5c1b58c0b" Feb 27 00:20:04 crc kubenswrapper[4781]: I0227 00:20:04.622733 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535860-d6xsb" Feb 27 00:20:04 crc kubenswrapper[4781]: I0227 00:20:04.974618 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535854-lplm8"] Feb 27 00:20:04 crc kubenswrapper[4781]: I0227 00:20:04.983960 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535854-lplm8"] Feb 27 00:20:05 crc kubenswrapper[4781]: I0227 00:20:05.321685 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2676f22-56e0-46ed-83d0-4d29fc704155" path="/var/lib/kubelet/pods/d2676f22-56e0-46ed-83d0-4d29fc704155/volumes" Feb 27 00:20:12 crc kubenswrapper[4781]: I0227 00:20:12.896093 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:20:12 crc kubenswrapper[4781]: I0227 00:20:12.897010 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:20:18 crc kubenswrapper[4781]: I0227 00:20:18.134935 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676"] Feb 27 00:20:18 crc kubenswrapper[4781]: E0227 00:20:18.135676 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8ba504f-040f-4632-b5d0-4b28aef8d27e" containerName="oc" Feb 27 00:20:18 crc kubenswrapper[4781]: I0227 00:20:18.135690 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8ba504f-040f-4632-b5d0-4b28aef8d27e" 
containerName="oc" Feb 27 00:20:18 crc kubenswrapper[4781]: I0227 00:20:18.135817 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8ba504f-040f-4632-b5d0-4b28aef8d27e" containerName="oc" Feb 27 00:20:18 crc kubenswrapper[4781]: I0227 00:20:18.136515 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676" Feb 27 00:20:18 crc kubenswrapper[4781]: I0227 00:20:18.138516 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 27 00:20:18 crc kubenswrapper[4781]: I0227 00:20:18.144822 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676"] Feb 27 00:20:18 crc kubenswrapper[4781]: I0227 00:20:18.264793 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxr5m\" (UniqueName: \"kubernetes.io/projected/f26f6c49-1028-49bf-9259-4c08b835cfbb-kube-api-access-dxr5m\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676\" (UID: \"f26f6c49-1028-49bf-9259-4c08b835cfbb\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676" Feb 27 00:20:18 crc kubenswrapper[4781]: I0227 00:20:18.265103 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f26f6c49-1028-49bf-9259-4c08b835cfbb-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676\" (UID: \"f26f6c49-1028-49bf-9259-4c08b835cfbb\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676" Feb 27 00:20:18 crc kubenswrapper[4781]: I0227 00:20:18.265172 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/f26f6c49-1028-49bf-9259-4c08b835cfbb-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676\" (UID: \"f26f6c49-1028-49bf-9259-4c08b835cfbb\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676" Feb 27 00:20:18 crc kubenswrapper[4781]: I0227 00:20:18.366681 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f26f6c49-1028-49bf-9259-4c08b835cfbb-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676\" (UID: \"f26f6c49-1028-49bf-9259-4c08b835cfbb\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676" Feb 27 00:20:18 crc kubenswrapper[4781]: I0227 00:20:18.366725 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f26f6c49-1028-49bf-9259-4c08b835cfbb-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676\" (UID: \"f26f6c49-1028-49bf-9259-4c08b835cfbb\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676" Feb 27 00:20:18 crc kubenswrapper[4781]: I0227 00:20:18.366762 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxr5m\" (UniqueName: \"kubernetes.io/projected/f26f6c49-1028-49bf-9259-4c08b835cfbb-kube-api-access-dxr5m\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676\" (UID: \"f26f6c49-1028-49bf-9259-4c08b835cfbb\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676" Feb 27 00:20:18 crc kubenswrapper[4781]: I0227 00:20:18.367204 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f26f6c49-1028-49bf-9259-4c08b835cfbb-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676\" (UID: 
\"f26f6c49-1028-49bf-9259-4c08b835cfbb\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676" Feb 27 00:20:18 crc kubenswrapper[4781]: I0227 00:20:18.367204 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f26f6c49-1028-49bf-9259-4c08b835cfbb-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676\" (UID: \"f26f6c49-1028-49bf-9259-4c08b835cfbb\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676" Feb 27 00:20:18 crc kubenswrapper[4781]: I0227 00:20:18.388032 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxr5m\" (UniqueName: \"kubernetes.io/projected/f26f6c49-1028-49bf-9259-4c08b835cfbb-kube-api-access-dxr5m\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676\" (UID: \"f26f6c49-1028-49bf-9259-4c08b835cfbb\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676" Feb 27 00:20:18 crc kubenswrapper[4781]: I0227 00:20:18.455738 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676" Feb 27 00:20:18 crc kubenswrapper[4781]: I0227 00:20:18.668931 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676"] Feb 27 00:20:18 crc kubenswrapper[4781]: I0227 00:20:18.735779 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676" event={"ID":"f26f6c49-1028-49bf-9259-4c08b835cfbb","Type":"ContainerStarted","Data":"357800bcda8de6f9e140d7c6d00b143ef325514626de6c9a5edf60efe4d0eda2"} Feb 27 00:20:19 crc kubenswrapper[4781]: I0227 00:20:19.744828 4781 generic.go:334] "Generic (PLEG): container finished" podID="f26f6c49-1028-49bf-9259-4c08b835cfbb" containerID="a27f23d6b7b5be18e741a4b8807d607d5da81ed53b1da7f9d1aab51b6d6a45c3" exitCode=0 Feb 27 00:20:19 crc kubenswrapper[4781]: I0227 00:20:19.744900 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676" event={"ID":"f26f6c49-1028-49bf-9259-4c08b835cfbb","Type":"ContainerDied","Data":"a27f23d6b7b5be18e741a4b8807d607d5da81ed53b1da7f9d1aab51b6d6a45c3"} Feb 27 00:20:21 crc kubenswrapper[4781]: E0227 00:20:21.406145 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf26f6c49_1028_49bf_9259_4c08b835cfbb.slice/crio-81b733e8736de15432ce35c1b163d861ec3f83f233c9edb6595f87fd26ff80c6.scope\": RecentStats: unable to find data in memory cache]" Feb 27 00:20:21 crc kubenswrapper[4781]: I0227 00:20:21.777313 4781 generic.go:334] "Generic (PLEG): container finished" podID="f26f6c49-1028-49bf-9259-4c08b835cfbb" containerID="81b733e8736de15432ce35c1b163d861ec3f83f233c9edb6595f87fd26ff80c6" exitCode=0 Feb 27 00:20:21 crc 
kubenswrapper[4781]: I0227 00:20:21.777398 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676" event={"ID":"f26f6c49-1028-49bf-9259-4c08b835cfbb","Type":"ContainerDied","Data":"81b733e8736de15432ce35c1b163d861ec3f83f233c9edb6595f87fd26ff80c6"} Feb 27 00:20:22 crc kubenswrapper[4781]: I0227 00:20:22.783441 4781 generic.go:334] "Generic (PLEG): container finished" podID="f26f6c49-1028-49bf-9259-4c08b835cfbb" containerID="7238e183748cbed223e390e70c50dd92d5f55a3b96c7fbb9aa940db829fde41c" exitCode=0 Feb 27 00:20:22 crc kubenswrapper[4781]: I0227 00:20:22.783516 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676" event={"ID":"f26f6c49-1028-49bf-9259-4c08b835cfbb","Type":"ContainerDied","Data":"7238e183748cbed223e390e70c50dd92d5f55a3b96c7fbb9aa940db829fde41c"} Feb 27 00:20:24 crc kubenswrapper[4781]: I0227 00:20:24.074857 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676" Feb 27 00:20:24 crc kubenswrapper[4781]: I0227 00:20:24.143326 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxr5m\" (UniqueName: \"kubernetes.io/projected/f26f6c49-1028-49bf-9259-4c08b835cfbb-kube-api-access-dxr5m\") pod \"f26f6c49-1028-49bf-9259-4c08b835cfbb\" (UID: \"f26f6c49-1028-49bf-9259-4c08b835cfbb\") " Feb 27 00:20:24 crc kubenswrapper[4781]: I0227 00:20:24.143434 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f26f6c49-1028-49bf-9259-4c08b835cfbb-util\") pod \"f26f6c49-1028-49bf-9259-4c08b835cfbb\" (UID: \"f26f6c49-1028-49bf-9259-4c08b835cfbb\") " Feb 27 00:20:24 crc kubenswrapper[4781]: I0227 00:20:24.143480 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f26f6c49-1028-49bf-9259-4c08b835cfbb-bundle\") pod \"f26f6c49-1028-49bf-9259-4c08b835cfbb\" (UID: \"f26f6c49-1028-49bf-9259-4c08b835cfbb\") " Feb 27 00:20:24 crc kubenswrapper[4781]: I0227 00:20:24.144239 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f26f6c49-1028-49bf-9259-4c08b835cfbb-bundle" (OuterVolumeSpecName: "bundle") pod "f26f6c49-1028-49bf-9259-4c08b835cfbb" (UID: "f26f6c49-1028-49bf-9259-4c08b835cfbb"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:20:24 crc kubenswrapper[4781]: I0227 00:20:24.148550 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f26f6c49-1028-49bf-9259-4c08b835cfbb-kube-api-access-dxr5m" (OuterVolumeSpecName: "kube-api-access-dxr5m") pod "f26f6c49-1028-49bf-9259-4c08b835cfbb" (UID: "f26f6c49-1028-49bf-9259-4c08b835cfbb"). InnerVolumeSpecName "kube-api-access-dxr5m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:20:24 crc kubenswrapper[4781]: I0227 00:20:24.162346 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f26f6c49-1028-49bf-9259-4c08b835cfbb-util" (OuterVolumeSpecName: "util") pod "f26f6c49-1028-49bf-9259-4c08b835cfbb" (UID: "f26f6c49-1028-49bf-9259-4c08b835cfbb"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:20:24 crc kubenswrapper[4781]: I0227 00:20:24.245657 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxr5m\" (UniqueName: \"kubernetes.io/projected/f26f6c49-1028-49bf-9259-4c08b835cfbb-kube-api-access-dxr5m\") on node \"crc\" DevicePath \"\"" Feb 27 00:20:24 crc kubenswrapper[4781]: I0227 00:20:24.245703 4781 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f26f6c49-1028-49bf-9259-4c08b835cfbb-util\") on node \"crc\" DevicePath \"\"" Feb 27 00:20:24 crc kubenswrapper[4781]: I0227 00:20:24.245725 4781 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f26f6c49-1028-49bf-9259-4c08b835cfbb-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:20:24 crc kubenswrapper[4781]: I0227 00:20:24.808279 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676" event={"ID":"f26f6c49-1028-49bf-9259-4c08b835cfbb","Type":"ContainerDied","Data":"357800bcda8de6f9e140d7c6d00b143ef325514626de6c9a5edf60efe4d0eda2"} Feb 27 00:20:24 crc kubenswrapper[4781]: I0227 00:20:24.808350 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="357800bcda8de6f9e140d7c6d00b143ef325514626de6c9a5edf60efe4d0eda2" Feb 27 00:20:24 crc kubenswrapper[4781]: I0227 00:20:24.808356 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676" Feb 27 00:20:27 crc kubenswrapper[4781]: I0227 00:20:27.737896 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-m8kqs"] Feb 27 00:20:27 crc kubenswrapper[4781]: E0227 00:20:27.738509 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f26f6c49-1028-49bf-9259-4c08b835cfbb" containerName="extract" Feb 27 00:20:27 crc kubenswrapper[4781]: I0227 00:20:27.738525 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f26f6c49-1028-49bf-9259-4c08b835cfbb" containerName="extract" Feb 27 00:20:27 crc kubenswrapper[4781]: E0227 00:20:27.738547 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f26f6c49-1028-49bf-9259-4c08b835cfbb" containerName="pull" Feb 27 00:20:27 crc kubenswrapper[4781]: I0227 00:20:27.738554 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f26f6c49-1028-49bf-9259-4c08b835cfbb" containerName="pull" Feb 27 00:20:27 crc kubenswrapper[4781]: E0227 00:20:27.738570 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f26f6c49-1028-49bf-9259-4c08b835cfbb" containerName="util" Feb 27 00:20:27 crc kubenswrapper[4781]: I0227 00:20:27.738579 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f26f6c49-1028-49bf-9259-4c08b835cfbb" containerName="util" Feb 27 00:20:27 crc kubenswrapper[4781]: I0227 00:20:27.738717 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f26f6c49-1028-49bf-9259-4c08b835cfbb" containerName="extract" Feb 27 00:20:27 crc kubenswrapper[4781]: I0227 00:20:27.739204 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-m8kqs" Feb 27 00:20:27 crc kubenswrapper[4781]: I0227 00:20:27.741835 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-2vq77" Feb 27 00:20:27 crc kubenswrapper[4781]: I0227 00:20:27.741972 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 27 00:20:27 crc kubenswrapper[4781]: I0227 00:20:27.742415 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 27 00:20:27 crc kubenswrapper[4781]: I0227 00:20:27.797174 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkwqv\" (UniqueName: \"kubernetes.io/projected/e948619f-a0f4-4463-9076-e593529e4264-kube-api-access-wkwqv\") pod \"nmstate-operator-75c5dccd6c-m8kqs\" (UID: \"e948619f-a0f4-4463-9076-e593529e4264\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-m8kqs" Feb 27 00:20:27 crc kubenswrapper[4781]: I0227 00:20:27.833149 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-m8kqs"] Feb 27 00:20:27 crc kubenswrapper[4781]: I0227 00:20:27.898591 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkwqv\" (UniqueName: \"kubernetes.io/projected/e948619f-a0f4-4463-9076-e593529e4264-kube-api-access-wkwqv\") pod \"nmstate-operator-75c5dccd6c-m8kqs\" (UID: \"e948619f-a0f4-4463-9076-e593529e4264\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-m8kqs" Feb 27 00:20:27 crc kubenswrapper[4781]: I0227 00:20:27.928067 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkwqv\" (UniqueName: \"kubernetes.io/projected/e948619f-a0f4-4463-9076-e593529e4264-kube-api-access-wkwqv\") pod \"nmstate-operator-75c5dccd6c-m8kqs\" (UID: 
\"e948619f-a0f4-4463-9076-e593529e4264\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-m8kqs" Feb 27 00:20:28 crc kubenswrapper[4781]: I0227 00:20:28.055649 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-m8kqs" Feb 27 00:20:28 crc kubenswrapper[4781]: I0227 00:20:28.299823 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-m8kqs"] Feb 27 00:20:28 crc kubenswrapper[4781]: I0227 00:20:28.831202 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-m8kqs" event={"ID":"e948619f-a0f4-4463-9076-e593529e4264","Type":"ContainerStarted","Data":"2b36192aa6f618f46a1e4679c7a5e3283e805c87fab274e79464a81048ebf24a"} Feb 27 00:20:31 crc kubenswrapper[4781]: I0227 00:20:31.864601 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-m8kqs" event={"ID":"e948619f-a0f4-4463-9076-e593529e4264","Type":"ContainerStarted","Data":"39e4b0f10e19f7bb1761553b73ecde98df0a78be2400cf4f599095a526b8a0d7"} Feb 27 00:20:31 crc kubenswrapper[4781]: I0227 00:20:31.892149 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-m8kqs" podStartSLOduration=2.07249484 podStartE2EDuration="4.892120922s" podCreationTimestamp="2026-02-27 00:20:27 +0000 UTC" firstStartedPulling="2026-02-27 00:20:28.307236449 +0000 UTC m=+897.564776003" lastFinishedPulling="2026-02-27 00:20:31.126862531 +0000 UTC m=+900.384402085" observedRunningTime="2026-02-27 00:20:31.885241407 +0000 UTC m=+901.142780981" watchObservedRunningTime="2026-02-27 00:20:31.892120922 +0000 UTC m=+901.149660506" Feb 27 00:20:32 crc kubenswrapper[4781]: I0227 00:20:32.898092 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-4d4ds"] Feb 27 00:20:32 crc kubenswrapper[4781]: I0227 
00:20:32.898906 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-4d4ds" Feb 27 00:20:32 crc kubenswrapper[4781]: I0227 00:20:32.901523 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-bq8xv" Feb 27 00:20:32 crc kubenswrapper[4781]: I0227 00:20:32.915655 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-vrv7p"] Feb 27 00:20:32 crc kubenswrapper[4781]: I0227 00:20:32.916544 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-vrv7p" Feb 27 00:20:32 crc kubenswrapper[4781]: I0227 00:20:32.919010 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 27 00:20:32 crc kubenswrapper[4781]: I0227 00:20:32.935523 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-vrv7p"] Feb 27 00:20:32 crc kubenswrapper[4781]: I0227 00:20:32.953210 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-r6fjq"] Feb 27 00:20:32 crc kubenswrapper[4781]: I0227 00:20:32.963317 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmh9w\" (UniqueName: \"kubernetes.io/projected/677ca1f7-513f-4de1-b64b-66b2524b82a1-kube-api-access-cmh9w\") pod \"nmstate-webhook-786f45cff4-vrv7p\" (UID: \"677ca1f7-513f-4de1-b64b-66b2524b82a1\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-vrv7p" Feb 27 00:20:32 crc kubenswrapper[4781]: I0227 00:20:32.963413 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbqrm\" (UniqueName: \"kubernetes.io/projected/2b001223-04cf-4a45-843b-e62c5d13ac14-kube-api-access-xbqrm\") pod \"nmstate-metrics-69594cc75-4d4ds\" (UID: 
\"2b001223-04cf-4a45-843b-e62c5d13ac14\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-4d4ds" Feb 27 00:20:32 crc kubenswrapper[4781]: I0227 00:20:32.963458 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/677ca1f7-513f-4de1-b64b-66b2524b82a1-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-vrv7p\" (UID: \"677ca1f7-513f-4de1-b64b-66b2524b82a1\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-vrv7p" Feb 27 00:20:32 crc kubenswrapper[4781]: I0227 00:20:32.971672 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-r6fjq" Feb 27 00:20:32 crc kubenswrapper[4781]: I0227 00:20:32.988021 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-4d4ds"] Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.041858 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5dzvp"] Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.042559 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5dzvp" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.045941 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.045947 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-hfnqh" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.047375 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.052609 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5dzvp"] Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.064873 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/677ca1f7-513f-4de1-b64b-66b2524b82a1-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-vrv7p\" (UID: \"677ca1f7-513f-4de1-b64b-66b2524b82a1\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-vrv7p" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.064948 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f7bf5593-bd4f-462d-bcbf-319b075a5116-nmstate-lock\") pod \"nmstate-handler-r6fjq\" (UID: \"f7bf5593-bd4f-462d-bcbf-319b075a5116\") " pod="openshift-nmstate/nmstate-handler-r6fjq" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.064986 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f7bf5593-bd4f-462d-bcbf-319b075a5116-dbus-socket\") pod \"nmstate-handler-r6fjq\" (UID: \"f7bf5593-bd4f-462d-bcbf-319b075a5116\") " pod="openshift-nmstate/nmstate-handler-r6fjq" Feb 27 00:20:33 crc 
kubenswrapper[4781]: I0227 00:20:33.065039 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gcdq\" (UniqueName: \"kubernetes.io/projected/f7bf5593-bd4f-462d-bcbf-319b075a5116-kube-api-access-8gcdq\") pod \"nmstate-handler-r6fjq\" (UID: \"f7bf5593-bd4f-462d-bcbf-319b075a5116\") " pod="openshift-nmstate/nmstate-handler-r6fjq" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.065094 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmh9w\" (UniqueName: \"kubernetes.io/projected/677ca1f7-513f-4de1-b64b-66b2524b82a1-kube-api-access-cmh9w\") pod \"nmstate-webhook-786f45cff4-vrv7p\" (UID: \"677ca1f7-513f-4de1-b64b-66b2524b82a1\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-vrv7p" Feb 27 00:20:33 crc kubenswrapper[4781]: E0227 00:20:33.065098 4781 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.065147 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f7bf5593-bd4f-462d-bcbf-319b075a5116-ovs-socket\") pod \"nmstate-handler-r6fjq\" (UID: \"f7bf5593-bd4f-462d-bcbf-319b075a5116\") " pod="openshift-nmstate/nmstate-handler-r6fjq" Feb 27 00:20:33 crc kubenswrapper[4781]: E0227 00:20:33.065165 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/677ca1f7-513f-4de1-b64b-66b2524b82a1-tls-key-pair podName:677ca1f7-513f-4de1-b64b-66b2524b82a1 nodeName:}" failed. No retries permitted until 2026-02-27 00:20:33.565146018 +0000 UTC m=+902.822685572 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/677ca1f7-513f-4de1-b64b-66b2524b82a1-tls-key-pair") pod "nmstate-webhook-786f45cff4-vrv7p" (UID: "677ca1f7-513f-4de1-b64b-66b2524b82a1") : secret "openshift-nmstate-webhook" not found Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.065180 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbqrm\" (UniqueName: \"kubernetes.io/projected/2b001223-04cf-4a45-843b-e62c5d13ac14-kube-api-access-xbqrm\") pod \"nmstate-metrics-69594cc75-4d4ds\" (UID: \"2b001223-04cf-4a45-843b-e62c5d13ac14\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-4d4ds" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.095580 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmh9w\" (UniqueName: \"kubernetes.io/projected/677ca1f7-513f-4de1-b64b-66b2524b82a1-kube-api-access-cmh9w\") pod \"nmstate-webhook-786f45cff4-vrv7p\" (UID: \"677ca1f7-513f-4de1-b64b-66b2524b82a1\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-vrv7p" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.098158 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbqrm\" (UniqueName: \"kubernetes.io/projected/2b001223-04cf-4a45-843b-e62c5d13ac14-kube-api-access-xbqrm\") pod \"nmstate-metrics-69594cc75-4d4ds\" (UID: \"2b001223-04cf-4a45-843b-e62c5d13ac14\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-4d4ds" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.166615 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f7bf5593-bd4f-462d-bcbf-319b075a5116-ovs-socket\") pod \"nmstate-handler-r6fjq\" (UID: \"f7bf5593-bd4f-462d-bcbf-319b075a5116\") " pod="openshift-nmstate/nmstate-handler-r6fjq" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.166705 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f7bf5593-bd4f-462d-bcbf-319b075a5116-nmstate-lock\") pod \"nmstate-handler-r6fjq\" (UID: \"f7bf5593-bd4f-462d-bcbf-319b075a5116\") " pod="openshift-nmstate/nmstate-handler-r6fjq" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.166728 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f7bf5593-bd4f-462d-bcbf-319b075a5116-dbus-socket\") pod \"nmstate-handler-r6fjq\" (UID: \"f7bf5593-bd4f-462d-bcbf-319b075a5116\") " pod="openshift-nmstate/nmstate-handler-r6fjq" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.166751 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqnhf\" (UniqueName: \"kubernetes.io/projected/fcd8e350-64e3-4a25-9bc5-cce4888da20a-kube-api-access-wqnhf\") pod \"nmstate-console-plugin-5dcbbd79cf-5dzvp\" (UID: \"fcd8e350-64e3-4a25-9bc5-cce4888da20a\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5dzvp" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.166776 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fcd8e350-64e3-4a25-9bc5-cce4888da20a-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-5dzvp\" (UID: \"fcd8e350-64e3-4a25-9bc5-cce4888da20a\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5dzvp" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.166804 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gcdq\" (UniqueName: \"kubernetes.io/projected/f7bf5593-bd4f-462d-bcbf-319b075a5116-kube-api-access-8gcdq\") pod \"nmstate-handler-r6fjq\" (UID: \"f7bf5593-bd4f-462d-bcbf-319b075a5116\") " pod="openshift-nmstate/nmstate-handler-r6fjq" Feb 27 00:20:33 crc kubenswrapper[4781]: 
I0227 00:20:33.166829 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fcd8e350-64e3-4a25-9bc5-cce4888da20a-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-5dzvp\" (UID: \"fcd8e350-64e3-4a25-9bc5-cce4888da20a\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5dzvp" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.166912 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f7bf5593-bd4f-462d-bcbf-319b075a5116-ovs-socket\") pod \"nmstate-handler-r6fjq\" (UID: \"f7bf5593-bd4f-462d-bcbf-319b075a5116\") " pod="openshift-nmstate/nmstate-handler-r6fjq" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.166941 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f7bf5593-bd4f-462d-bcbf-319b075a5116-nmstate-lock\") pod \"nmstate-handler-r6fjq\" (UID: \"f7bf5593-bd4f-462d-bcbf-319b075a5116\") " pod="openshift-nmstate/nmstate-handler-r6fjq" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.167173 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f7bf5593-bd4f-462d-bcbf-319b075a5116-dbus-socket\") pod \"nmstate-handler-r6fjq\" (UID: \"f7bf5593-bd4f-462d-bcbf-319b075a5116\") " pod="openshift-nmstate/nmstate-handler-r6fjq" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.201278 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gcdq\" (UniqueName: \"kubernetes.io/projected/f7bf5593-bd4f-462d-bcbf-319b075a5116-kube-api-access-8gcdq\") pod \"nmstate-handler-r6fjq\" (UID: \"f7bf5593-bd4f-462d-bcbf-319b075a5116\") " pod="openshift-nmstate/nmstate-handler-r6fjq" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.213867 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-4d4ds" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.268264 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fcd8e350-64e3-4a25-9bc5-cce4888da20a-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-5dzvp\" (UID: \"fcd8e350-64e3-4a25-9bc5-cce4888da20a\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5dzvp" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.268897 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fcd8e350-64e3-4a25-9bc5-cce4888da20a-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-5dzvp\" (UID: \"fcd8e350-64e3-4a25-9bc5-cce4888da20a\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5dzvp" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.269050 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqnhf\" (UniqueName: \"kubernetes.io/projected/fcd8e350-64e3-4a25-9bc5-cce4888da20a-kube-api-access-wqnhf\") pod \"nmstate-console-plugin-5dcbbd79cf-5dzvp\" (UID: \"fcd8e350-64e3-4a25-9bc5-cce4888da20a\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5dzvp" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.270170 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fcd8e350-64e3-4a25-9bc5-cce4888da20a-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-5dzvp\" (UID: \"fcd8e350-64e3-4a25-9bc5-cce4888da20a\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5dzvp" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.274217 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/fcd8e350-64e3-4a25-9bc5-cce4888da20a-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-5dzvp\" (UID: \"fcd8e350-64e3-4a25-9bc5-cce4888da20a\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5dzvp" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.278751 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f69fbfd98-lv4mj"] Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.279545 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.296966 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-r6fjq" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.297409 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqnhf\" (UniqueName: \"kubernetes.io/projected/fcd8e350-64e3-4a25-9bc5-cce4888da20a-kube-api-access-wqnhf\") pod \"nmstate-console-plugin-5dcbbd79cf-5dzvp\" (UID: \"fcd8e350-64e3-4a25-9bc5-cce4888da20a\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5dzvp" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.297110 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f69fbfd98-lv4mj"] Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.354993 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5dzvp" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.373417 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-console-serving-cert\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.373465 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggl6g\" (UniqueName: \"kubernetes.io/projected/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-kube-api-access-ggl6g\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.373489 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-console-oauth-config\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.373511 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-console-config\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.373537 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-service-ca\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.373564 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-oauth-serving-cert\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.373581 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-trusted-ca-bundle\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.474876 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-console-serving-cert\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.474915 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggl6g\" (UniqueName: \"kubernetes.io/projected/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-kube-api-access-ggl6g\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.474939 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-console-oauth-config\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.474960 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-console-config\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.474988 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-service-ca\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.475020 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-oauth-serving-cert\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.475037 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-trusted-ca-bundle\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.476224 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-console-config\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.476701 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-oauth-serving-cert\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.476912 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-service-ca\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.477114 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-trusted-ca-bundle\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.480617 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-console-serving-cert\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.480996 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-console-oauth-config\") 
pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.484856 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-4d4ds"] Feb 27 00:20:33 crc kubenswrapper[4781]: W0227 00:20:33.488231 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b001223_04cf_4a45_843b_e62c5d13ac14.slice/crio-b92dec761f36b95ef2e423133c495dc82e43679e764f384170f4576e434e96a9 WatchSource:0}: Error finding container b92dec761f36b95ef2e423133c495dc82e43679e764f384170f4576e434e96a9: Status 404 returned error can't find the container with id b92dec761f36b95ef2e423133c495dc82e43679e764f384170f4576e434e96a9 Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.492443 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggl6g\" (UniqueName: \"kubernetes.io/projected/8078e41d-b0b1-4f24-8d32-cd44b315f3e6-kube-api-access-ggl6g\") pod \"console-f69fbfd98-lv4mj\" (UID: \"8078e41d-b0b1-4f24-8d32-cd44b315f3e6\") " pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.545508 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5dzvp"] Feb 27 00:20:33 crc kubenswrapper[4781]: W0227 00:20:33.556570 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcd8e350_64e3_4a25_9bc5_cce4888da20a.slice/crio-489cc6a2ba3df41687d3dbe738b2af0698b97d99c29e1c4e619a36d0c2ee734b WatchSource:0}: Error finding container 489cc6a2ba3df41687d3dbe738b2af0698b97d99c29e1c4e619a36d0c2ee734b: Status 404 returned error can't find the container with id 489cc6a2ba3df41687d3dbe738b2af0698b97d99c29e1c4e619a36d0c2ee734b Feb 27 00:20:33 crc kubenswrapper[4781]: 
I0227 00:20:33.576406 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/677ca1f7-513f-4de1-b64b-66b2524b82a1-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-vrv7p\" (UID: \"677ca1f7-513f-4de1-b64b-66b2524b82a1\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-vrv7p" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.579433 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/677ca1f7-513f-4de1-b64b-66b2524b82a1-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-vrv7p\" (UID: \"677ca1f7-513f-4de1-b64b-66b2524b82a1\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-vrv7p" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.608453 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.829506 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-vrv7p" Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.877778 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-4d4ds" event={"ID":"2b001223-04cf-4a45-843b-e62c5d13ac14","Type":"ContainerStarted","Data":"b92dec761f36b95ef2e423133c495dc82e43679e764f384170f4576e434e96a9"} Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.878511 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5dzvp" event={"ID":"fcd8e350-64e3-4a25-9bc5-cce4888da20a","Type":"ContainerStarted","Data":"489cc6a2ba3df41687d3dbe738b2af0698b97d99c29e1c4e619a36d0c2ee734b"} Feb 27 00:20:33 crc kubenswrapper[4781]: I0227 00:20:33.879317 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-r6fjq" event={"ID":"f7bf5593-bd4f-462d-bcbf-319b075a5116","Type":"ContainerStarted","Data":"8ce1d847320d4b7a095d0231ec7bfe36b0b9de7e180d1647a4c7fd21430bb6da"} Feb 27 00:20:34 crc kubenswrapper[4781]: I0227 00:20:34.036975 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-vrv7p"] Feb 27 00:20:34 crc kubenswrapper[4781]: W0227 00:20:34.040579 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod677ca1f7_513f_4de1_b64b_66b2524b82a1.slice/crio-4efe8519c899784ddd182c7c70d959a95caac65f6d28bb883fe155c5d9cfba5a WatchSource:0}: Error finding container 4efe8519c899784ddd182c7c70d959a95caac65f6d28bb883fe155c5d9cfba5a: Status 404 returned error can't find the container with id 4efe8519c899784ddd182c7c70d959a95caac65f6d28bb883fe155c5d9cfba5a Feb 27 00:20:34 crc kubenswrapper[4781]: W0227 00:20:34.041501 4781 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8078e41d_b0b1_4f24_8d32_cd44b315f3e6.slice/crio-5160ef6ee4b95fd7e750137dc965b4d1c348383a01e2671c904b79e580408d0f WatchSource:0}: Error finding container 5160ef6ee4b95fd7e750137dc965b4d1c348383a01e2671c904b79e580408d0f: Status 404 returned error can't find the container with id 5160ef6ee4b95fd7e750137dc965b4d1c348383a01e2671c904b79e580408d0f Feb 27 00:20:34 crc kubenswrapper[4781]: I0227 00:20:34.043350 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f69fbfd98-lv4mj"] Feb 27 00:20:34 crc kubenswrapper[4781]: I0227 00:20:34.887714 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f69fbfd98-lv4mj" event={"ID":"8078e41d-b0b1-4f24-8d32-cd44b315f3e6","Type":"ContainerStarted","Data":"f4fca42a15c68ede53a37518e14c977fbab2bebb7b36c506d72befc891c19f4e"} Feb 27 00:20:34 crc kubenswrapper[4781]: I0227 00:20:34.888129 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f69fbfd98-lv4mj" event={"ID":"8078e41d-b0b1-4f24-8d32-cd44b315f3e6","Type":"ContainerStarted","Data":"5160ef6ee4b95fd7e750137dc965b4d1c348383a01e2671c904b79e580408d0f"} Feb 27 00:20:34 crc kubenswrapper[4781]: I0227 00:20:34.889198 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-vrv7p" event={"ID":"677ca1f7-513f-4de1-b64b-66b2524b82a1","Type":"ContainerStarted","Data":"4efe8519c899784ddd182c7c70d959a95caac65f6d28bb883fe155c5d9cfba5a"} Feb 27 00:20:34 crc kubenswrapper[4781]: I0227 00:20:34.915653 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f69fbfd98-lv4mj" podStartSLOduration=1.9156116600000002 podStartE2EDuration="1.91561166s" podCreationTimestamp="2026-02-27 00:20:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 
00:20:34.911926716 +0000 UTC m=+904.169466280" watchObservedRunningTime="2026-02-27 00:20:34.91561166 +0000 UTC m=+904.173151234" Feb 27 00:20:37 crc kubenswrapper[4781]: I0227 00:20:37.906344 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-vrv7p" event={"ID":"677ca1f7-513f-4de1-b64b-66b2524b82a1","Type":"ContainerStarted","Data":"b24d00327247c18c40436086c4b92c8ada38f79188f28be675370d9b57271c6d"} Feb 27 00:20:37 crc kubenswrapper[4781]: I0227 00:20:37.906749 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-vrv7p" Feb 27 00:20:37 crc kubenswrapper[4781]: I0227 00:20:37.908076 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-4d4ds" event={"ID":"2b001223-04cf-4a45-843b-e62c5d13ac14","Type":"ContainerStarted","Data":"be9c6004dce2452f11b70ded74b4f9f553504775e50eaaf43d4aec4d14d75912"} Feb 27 00:20:37 crc kubenswrapper[4781]: I0227 00:20:37.909807 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5dzvp" event={"ID":"fcd8e350-64e3-4a25-9bc5-cce4888da20a","Type":"ContainerStarted","Data":"540e9e59ee52822899bd9318ccf25217de77c679afe2d9b6ecca061bb64062c4"} Feb 27 00:20:37 crc kubenswrapper[4781]: I0227 00:20:37.911584 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-r6fjq" event={"ID":"f7bf5593-bd4f-462d-bcbf-319b075a5116","Type":"ContainerStarted","Data":"7807686042fa2e86cafbd22dec2c45194cb7970593eea7565633b00292dd3860"} Feb 27 00:20:37 crc kubenswrapper[4781]: I0227 00:20:37.911677 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-r6fjq" Feb 27 00:20:37 crc kubenswrapper[4781]: I0227 00:20:37.921598 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-vrv7p" 
podStartSLOduration=2.43966489 podStartE2EDuration="5.921583263s" podCreationTimestamp="2026-02-27 00:20:32 +0000 UTC" firstStartedPulling="2026-02-27 00:20:34.044054954 +0000 UTC m=+903.301594518" lastFinishedPulling="2026-02-27 00:20:37.525973337 +0000 UTC m=+906.783512891" observedRunningTime="2026-02-27 00:20:37.920601868 +0000 UTC m=+907.178141422" watchObservedRunningTime="2026-02-27 00:20:37.921583263 +0000 UTC m=+907.179122827" Feb 27 00:20:37 crc kubenswrapper[4781]: I0227 00:20:37.937366 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-r6fjq" podStartSLOduration=1.771807468 podStartE2EDuration="5.937351014s" podCreationTimestamp="2026-02-27 00:20:32 +0000 UTC" firstStartedPulling="2026-02-27 00:20:33.354289695 +0000 UTC m=+902.611829249" lastFinishedPulling="2026-02-27 00:20:37.519833251 +0000 UTC m=+906.777372795" observedRunningTime="2026-02-27 00:20:37.936054501 +0000 UTC m=+907.193594075" watchObservedRunningTime="2026-02-27 00:20:37.937351014 +0000 UTC m=+907.194890568" Feb 27 00:20:37 crc kubenswrapper[4781]: I0227 00:20:37.957840 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5dzvp" podStartSLOduration=1.014247806 podStartE2EDuration="4.957825415s" podCreationTimestamp="2026-02-27 00:20:33 +0000 UTC" firstStartedPulling="2026-02-27 00:20:33.558304445 +0000 UTC m=+902.815843999" lastFinishedPulling="2026-02-27 00:20:37.501882054 +0000 UTC m=+906.759421608" observedRunningTime="2026-02-27 00:20:37.957360093 +0000 UTC m=+907.214899647" watchObservedRunningTime="2026-02-27 00:20:37.957825415 +0000 UTC m=+907.215364969" Feb 27 00:20:40 crc kubenswrapper[4781]: I0227 00:20:40.935726 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-4d4ds" 
event={"ID":"2b001223-04cf-4a45-843b-e62c5d13ac14","Type":"ContainerStarted","Data":"af3d0961a88bd8aa4970ea6c83db478faadeb698c987a49aa383d178c8476d52"} Feb 27 00:20:40 crc kubenswrapper[4781]: I0227 00:20:40.953097 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-69594cc75-4d4ds" podStartSLOduration=2.201980102 podStartE2EDuration="8.953074485s" podCreationTimestamp="2026-02-27 00:20:32 +0000 UTC" firstStartedPulling="2026-02-27 00:20:33.496059351 +0000 UTC m=+902.753598905" lastFinishedPulling="2026-02-27 00:20:40.247153734 +0000 UTC m=+909.504693288" observedRunningTime="2026-02-27 00:20:40.951756132 +0000 UTC m=+910.209295726" watchObservedRunningTime="2026-02-27 00:20:40.953074485 +0000 UTC m=+910.210614079" Feb 27 00:20:42 crc kubenswrapper[4781]: I0227 00:20:42.896026 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:20:42 crc kubenswrapper[4781]: I0227 00:20:42.896398 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:20:42 crc kubenswrapper[4781]: I0227 00:20:42.896460 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:20:42 crc kubenswrapper[4781]: I0227 00:20:42.897302 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"4a4838ae34a31bed19fe04c8cb77eb7ca161a34e4d168445bf5a5f93e91a959a"} pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 00:20:42 crc kubenswrapper[4781]: I0227 00:20:42.897396 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" containerID="cri-o://4a4838ae34a31bed19fe04c8cb77eb7ca161a34e4d168445bf5a5f93e91a959a" gracePeriod=600 Feb 27 00:20:43 crc kubenswrapper[4781]: I0227 00:20:43.322572 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-r6fjq" Feb 27 00:20:43 crc kubenswrapper[4781]: I0227 00:20:43.609711 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:43 crc kubenswrapper[4781]: I0227 00:20:43.610110 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:43 crc kubenswrapper[4781]: I0227 00:20:43.617347 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:43 crc kubenswrapper[4781]: I0227 00:20:43.971239 4781 generic.go:334] "Generic (PLEG): container finished" podID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerID="4a4838ae34a31bed19fe04c8cb77eb7ca161a34e4d168445bf5a5f93e91a959a" exitCode=0 Feb 27 00:20:43 crc kubenswrapper[4781]: I0227 00:20:43.971319 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerDied","Data":"4a4838ae34a31bed19fe04c8cb77eb7ca161a34e4d168445bf5a5f93e91a959a"} Feb 27 
00:20:43 crc kubenswrapper[4781]: I0227 00:20:43.971472 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"58cd249b96a5284dbe453e012e30bb3f9acbc9ed9b891c6e44075d418edc5ad9"} Feb 27 00:20:43 crc kubenswrapper[4781]: I0227 00:20:43.971512 4781 scope.go:117] "RemoveContainer" containerID="98d9908780c17a21a4a701f7c994bde3e3fbb6ea911f1b4e11c3a27ce7db4d1d" Feb 27 00:20:43 crc kubenswrapper[4781]: I0227 00:20:43.984272 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f69fbfd98-lv4mj" Feb 27 00:20:44 crc kubenswrapper[4781]: I0227 00:20:44.052003 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-vtsxv"] Feb 27 00:20:48 crc kubenswrapper[4781]: I0227 00:20:48.967659 4781 scope.go:117] "RemoveContainer" containerID="7ea50ff483bc5e473c8ac4484b625c2d3aca274594f654dad11472e0c517581a" Feb 27 00:20:53 crc kubenswrapper[4781]: I0227 00:20:53.839473 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-vrv7p" Feb 27 00:21:08 crc kubenswrapper[4781]: I0227 00:21:08.414484 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv"] Feb 27 00:21:08 crc kubenswrapper[4781]: I0227 00:21:08.416317 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" Feb 27 00:21:08 crc kubenswrapper[4781]: I0227 00:21:08.418015 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 27 00:21:08 crc kubenswrapper[4781]: I0227 00:21:08.421020 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv"] Feb 27 00:21:08 crc kubenswrapper[4781]: I0227 00:21:08.504744 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2112f4cb-1229-4856-b3ec-a882e6fba5a6-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv\" (UID: \"2112f4cb-1229-4856-b3ec-a882e6fba5a6\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" Feb 27 00:21:08 crc kubenswrapper[4781]: I0227 00:21:08.505094 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2112f4cb-1229-4856-b3ec-a882e6fba5a6-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv\" (UID: \"2112f4cb-1229-4856-b3ec-a882e6fba5a6\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" Feb 27 00:21:08 crc kubenswrapper[4781]: I0227 00:21:08.505311 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-788sk\" (UniqueName: \"kubernetes.io/projected/2112f4cb-1229-4856-b3ec-a882e6fba5a6-kube-api-access-788sk\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv\" (UID: \"2112f4cb-1229-4856-b3ec-a882e6fba5a6\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" Feb 27 00:21:08 crc kubenswrapper[4781]: 
I0227 00:21:08.606767 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2112f4cb-1229-4856-b3ec-a882e6fba5a6-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv\" (UID: \"2112f4cb-1229-4856-b3ec-a882e6fba5a6\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" Feb 27 00:21:08 crc kubenswrapper[4781]: I0227 00:21:08.607037 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2112f4cb-1229-4856-b3ec-a882e6fba5a6-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv\" (UID: \"2112f4cb-1229-4856-b3ec-a882e6fba5a6\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" Feb 27 00:21:08 crc kubenswrapper[4781]: I0227 00:21:08.607151 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-788sk\" (UniqueName: \"kubernetes.io/projected/2112f4cb-1229-4856-b3ec-a882e6fba5a6-kube-api-access-788sk\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv\" (UID: \"2112f4cb-1229-4856-b3ec-a882e6fba5a6\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" Feb 27 00:21:08 crc kubenswrapper[4781]: I0227 00:21:08.607428 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2112f4cb-1229-4856-b3ec-a882e6fba5a6-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv\" (UID: \"2112f4cb-1229-4856-b3ec-a882e6fba5a6\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" Feb 27 00:21:08 crc kubenswrapper[4781]: I0227 00:21:08.607438 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/2112f4cb-1229-4856-b3ec-a882e6fba5a6-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv\" (UID: \"2112f4cb-1229-4856-b3ec-a882e6fba5a6\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" Feb 27 00:21:08 crc kubenswrapper[4781]: I0227 00:21:08.628983 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-788sk\" (UniqueName: \"kubernetes.io/projected/2112f4cb-1229-4856-b3ec-a882e6fba5a6-kube-api-access-788sk\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv\" (UID: \"2112f4cb-1229-4856-b3ec-a882e6fba5a6\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" Feb 27 00:21:08 crc kubenswrapper[4781]: I0227 00:21:08.730139 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" Feb 27 00:21:08 crc kubenswrapper[4781]: I0227 00:21:08.993451 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv"] Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.094316 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-vtsxv" podUID="76705148-274c-4428-9508-13fe1193646e" containerName="console" containerID="cri-o://ad0786127f59b73fe3c925c0b6ad95bb559699fabe3ed584f4d314fe68a7865d" gracePeriod=15 Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.169719 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" event={"ID":"2112f4cb-1229-4856-b3ec-a882e6fba5a6","Type":"ContainerStarted","Data":"0ddb319720490bd564d819eecab4446118317263411f03b09128eef25f9ffd33"} Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.170054 4781 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" event={"ID":"2112f4cb-1229-4856-b3ec-a882e6fba5a6","Type":"ContainerStarted","Data":"a030052635f8b40a9e51e27e5e1179137be1981e6c214ce14624ac6e58bdb42f"} Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.371437 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-vtsxv_76705148-274c-4428-9508-13fe1193646e/console/0.log" Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.371499 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.416292 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/76705148-274c-4428-9508-13fe1193646e-console-serving-cert\") pod \"76705148-274c-4428-9508-13fe1193646e\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.416405 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-service-ca\") pod \"76705148-274c-4428-9508-13fe1193646e\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.416436 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-oauth-serving-cert\") pod \"76705148-274c-4428-9508-13fe1193646e\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.416453 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-trusted-ca-bundle\") pod \"76705148-274c-4428-9508-13fe1193646e\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.416475 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-console-config\") pod \"76705148-274c-4428-9508-13fe1193646e\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.416524 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/76705148-274c-4428-9508-13fe1193646e-console-oauth-config\") pod \"76705148-274c-4428-9508-13fe1193646e\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.416549 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjdbq\" (UniqueName: \"kubernetes.io/projected/76705148-274c-4428-9508-13fe1193646e-kube-api-access-xjdbq\") pod \"76705148-274c-4428-9508-13fe1193646e\" (UID: \"76705148-274c-4428-9508-13fe1193646e\") " Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.417245 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "76705148-274c-4428-9508-13fe1193646e" (UID: "76705148-274c-4428-9508-13fe1193646e"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.417259 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-console-config" (OuterVolumeSpecName: "console-config") pod "76705148-274c-4428-9508-13fe1193646e" (UID: "76705148-274c-4428-9508-13fe1193646e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.417280 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "76705148-274c-4428-9508-13fe1193646e" (UID: "76705148-274c-4428-9508-13fe1193646e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.417292 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-service-ca" (OuterVolumeSpecName: "service-ca") pod "76705148-274c-4428-9508-13fe1193646e" (UID: "76705148-274c-4428-9508-13fe1193646e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.421474 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76705148-274c-4428-9508-13fe1193646e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "76705148-274c-4428-9508-13fe1193646e" (UID: "76705148-274c-4428-9508-13fe1193646e"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.421723 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76705148-274c-4428-9508-13fe1193646e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "76705148-274c-4428-9508-13fe1193646e" (UID: "76705148-274c-4428-9508-13fe1193646e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.422313 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76705148-274c-4428-9508-13fe1193646e-kube-api-access-xjdbq" (OuterVolumeSpecName: "kube-api-access-xjdbq") pod "76705148-274c-4428-9508-13fe1193646e" (UID: "76705148-274c-4428-9508-13fe1193646e"). InnerVolumeSpecName "kube-api-access-xjdbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.518237 4781 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/76705148-274c-4428-9508-13fe1193646e-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.518273 4781 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-service-ca\") on node \"crc\" DevicePath \"\"" Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.518282 4781 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.518312 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.518321 4781 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/76705148-274c-4428-9508-13fe1193646e-console-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.518328 4781 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/76705148-274c-4428-9508-13fe1193646e-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:21:09 crc kubenswrapper[4781]: I0227 00:21:09.518336 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjdbq\" (UniqueName: \"kubernetes.io/projected/76705148-274c-4428-9508-13fe1193646e-kube-api-access-xjdbq\") on node \"crc\" DevicePath \"\"" Feb 27 00:21:10 crc kubenswrapper[4781]: I0227 00:21:10.178204 4781 generic.go:334] "Generic (PLEG): container finished" podID="2112f4cb-1229-4856-b3ec-a882e6fba5a6" containerID="0ddb319720490bd564d819eecab4446118317263411f03b09128eef25f9ffd33" exitCode=0 Feb 27 00:21:10 crc kubenswrapper[4781]: I0227 00:21:10.178375 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" event={"ID":"2112f4cb-1229-4856-b3ec-a882e6fba5a6","Type":"ContainerDied","Data":"0ddb319720490bd564d819eecab4446118317263411f03b09128eef25f9ffd33"} Feb 27 00:21:10 crc kubenswrapper[4781]: I0227 00:21:10.180255 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 00:21:10 crc kubenswrapper[4781]: I0227 00:21:10.181672 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-vtsxv_76705148-274c-4428-9508-13fe1193646e/console/0.log" Feb 27 00:21:10 crc kubenswrapper[4781]: I0227 
00:21:10.181702 4781 generic.go:334] "Generic (PLEG): container finished" podID="76705148-274c-4428-9508-13fe1193646e" containerID="ad0786127f59b73fe3c925c0b6ad95bb559699fabe3ed584f4d314fe68a7865d" exitCode=2 Feb 27 00:21:10 crc kubenswrapper[4781]: I0227 00:21:10.181728 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vtsxv" event={"ID":"76705148-274c-4428-9508-13fe1193646e","Type":"ContainerDied","Data":"ad0786127f59b73fe3c925c0b6ad95bb559699fabe3ed584f4d314fe68a7865d"} Feb 27 00:21:10 crc kubenswrapper[4781]: I0227 00:21:10.181751 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vtsxv" event={"ID":"76705148-274c-4428-9508-13fe1193646e","Type":"ContainerDied","Data":"52e8848cb853a0dc3b72ab7abe99678676a1a3484d971d2212d9dc7e0814de5c"} Feb 27 00:21:10 crc kubenswrapper[4781]: I0227 00:21:10.181768 4781 scope.go:117] "RemoveContainer" containerID="ad0786127f59b73fe3c925c0b6ad95bb559699fabe3ed584f4d314fe68a7865d" Feb 27 00:21:10 crc kubenswrapper[4781]: I0227 00:21:10.181854 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-vtsxv" Feb 27 00:21:10 crc kubenswrapper[4781]: I0227 00:21:10.219128 4781 scope.go:117] "RemoveContainer" containerID="ad0786127f59b73fe3c925c0b6ad95bb559699fabe3ed584f4d314fe68a7865d" Feb 27 00:21:10 crc kubenswrapper[4781]: E0227 00:21:10.219938 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad0786127f59b73fe3c925c0b6ad95bb559699fabe3ed584f4d314fe68a7865d\": container with ID starting with ad0786127f59b73fe3c925c0b6ad95bb559699fabe3ed584f4d314fe68a7865d not found: ID does not exist" containerID="ad0786127f59b73fe3c925c0b6ad95bb559699fabe3ed584f4d314fe68a7865d" Feb 27 00:21:10 crc kubenswrapper[4781]: I0227 00:21:10.219975 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad0786127f59b73fe3c925c0b6ad95bb559699fabe3ed584f4d314fe68a7865d"} err="failed to get container status \"ad0786127f59b73fe3c925c0b6ad95bb559699fabe3ed584f4d314fe68a7865d\": rpc error: code = NotFound desc = could not find container \"ad0786127f59b73fe3c925c0b6ad95bb559699fabe3ed584f4d314fe68a7865d\": container with ID starting with ad0786127f59b73fe3c925c0b6ad95bb559699fabe3ed584f4d314fe68a7865d not found: ID does not exist" Feb 27 00:21:10 crc kubenswrapper[4781]: I0227 00:21:10.236241 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-vtsxv"] Feb 27 00:21:10 crc kubenswrapper[4781]: I0227 00:21:10.240475 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-vtsxv"] Feb 27 00:21:11 crc kubenswrapper[4781]: I0227 00:21:11.316054 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76705148-274c-4428-9508-13fe1193646e" path="/var/lib/kubelet/pods/76705148-274c-4428-9508-13fe1193646e/volumes" Feb 27 00:21:12 crc kubenswrapper[4781]: E0227 00:21:12.137261 4781 cadvisor_stats_provider.go:516] 
"Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2112f4cb_1229_4856_b3ec_a882e6fba5a6.slice/crio-adb86b91ec03d7d8d5d9703cdb25703d1f3fd14c5f579236b7989c975ab0b1d4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2112f4cb_1229_4856_b3ec_a882e6fba5a6.slice/crio-conmon-adb86b91ec03d7d8d5d9703cdb25703d1f3fd14c5f579236b7989c975ab0b1d4.scope\": RecentStats: unable to find data in memory cache]" Feb 27 00:21:12 crc kubenswrapper[4781]: I0227 00:21:12.201405 4781 generic.go:334] "Generic (PLEG): container finished" podID="2112f4cb-1229-4856-b3ec-a882e6fba5a6" containerID="adb86b91ec03d7d8d5d9703cdb25703d1f3fd14c5f579236b7989c975ab0b1d4" exitCode=0 Feb 27 00:21:12 crc kubenswrapper[4781]: I0227 00:21:12.201489 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" event={"ID":"2112f4cb-1229-4856-b3ec-a882e6fba5a6","Type":"ContainerDied","Data":"adb86b91ec03d7d8d5d9703cdb25703d1f3fd14c5f579236b7989c975ab0b1d4"} Feb 27 00:21:13 crc kubenswrapper[4781]: I0227 00:21:13.211223 4781 generic.go:334] "Generic (PLEG): container finished" podID="2112f4cb-1229-4856-b3ec-a882e6fba5a6" containerID="95d543c78d543c8b4990d35fd5e9ec57547440a0e5ce353d2b81c5ee11526c3c" exitCode=0 Feb 27 00:21:13 crc kubenswrapper[4781]: I0227 00:21:13.211339 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" event={"ID":"2112f4cb-1229-4856-b3ec-a882e6fba5a6","Type":"ContainerDied","Data":"95d543c78d543c8b4990d35fd5e9ec57547440a0e5ce353d2b81c5ee11526c3c"} Feb 27 00:21:14 crc kubenswrapper[4781]: I0227 00:21:14.494857 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" Feb 27 00:21:14 crc kubenswrapper[4781]: I0227 00:21:14.585413 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2112f4cb-1229-4856-b3ec-a882e6fba5a6-util\") pod \"2112f4cb-1229-4856-b3ec-a882e6fba5a6\" (UID: \"2112f4cb-1229-4856-b3ec-a882e6fba5a6\") " Feb 27 00:21:14 crc kubenswrapper[4781]: I0227 00:21:14.585461 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2112f4cb-1229-4856-b3ec-a882e6fba5a6-bundle\") pod \"2112f4cb-1229-4856-b3ec-a882e6fba5a6\" (UID: \"2112f4cb-1229-4856-b3ec-a882e6fba5a6\") " Feb 27 00:21:14 crc kubenswrapper[4781]: I0227 00:21:14.585486 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-788sk\" (UniqueName: \"kubernetes.io/projected/2112f4cb-1229-4856-b3ec-a882e6fba5a6-kube-api-access-788sk\") pod \"2112f4cb-1229-4856-b3ec-a882e6fba5a6\" (UID: \"2112f4cb-1229-4856-b3ec-a882e6fba5a6\") " Feb 27 00:21:14 crc kubenswrapper[4781]: I0227 00:21:14.586604 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2112f4cb-1229-4856-b3ec-a882e6fba5a6-bundle" (OuterVolumeSpecName: "bundle") pod "2112f4cb-1229-4856-b3ec-a882e6fba5a6" (UID: "2112f4cb-1229-4856-b3ec-a882e6fba5a6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:21:14 crc kubenswrapper[4781]: I0227 00:21:14.590108 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2112f4cb-1229-4856-b3ec-a882e6fba5a6-kube-api-access-788sk" (OuterVolumeSpecName: "kube-api-access-788sk") pod "2112f4cb-1229-4856-b3ec-a882e6fba5a6" (UID: "2112f4cb-1229-4856-b3ec-a882e6fba5a6"). InnerVolumeSpecName "kube-api-access-788sk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:21:14 crc kubenswrapper[4781]: I0227 00:21:14.686957 4781 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2112f4cb-1229-4856-b3ec-a882e6fba5a6-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:21:14 crc kubenswrapper[4781]: I0227 00:21:14.686986 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-788sk\" (UniqueName: \"kubernetes.io/projected/2112f4cb-1229-4856-b3ec-a882e6fba5a6-kube-api-access-788sk\") on node \"crc\" DevicePath \"\"" Feb 27 00:21:14 crc kubenswrapper[4781]: I0227 00:21:14.795852 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2112f4cb-1229-4856-b3ec-a882e6fba5a6-util" (OuterVolumeSpecName: "util") pod "2112f4cb-1229-4856-b3ec-a882e6fba5a6" (UID: "2112f4cb-1229-4856-b3ec-a882e6fba5a6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:21:14 crc kubenswrapper[4781]: I0227 00:21:14.890622 4781 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2112f4cb-1229-4856-b3ec-a882e6fba5a6-util\") on node \"crc\" DevicePath \"\"" Feb 27 00:21:15 crc kubenswrapper[4781]: I0227 00:21:15.228172 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" event={"ID":"2112f4cb-1229-4856-b3ec-a882e6fba5a6","Type":"ContainerDied","Data":"a030052635f8b40a9e51e27e5e1179137be1981e6c214ce14624ac6e58bdb42f"} Feb 27 00:21:15 crc kubenswrapper[4781]: I0227 00:21:15.228231 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a030052635f8b40a9e51e27e5e1179137be1981e6c214ce14624ac6e58bdb42f" Feb 27 00:21:15 crc kubenswrapper[4781]: I0227 00:21:15.228288 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv" Feb 27 00:21:22 crc kubenswrapper[4781]: I0227 00:21:22.910071 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk"] Feb 27 00:21:22 crc kubenswrapper[4781]: E0227 00:21:22.910704 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2112f4cb-1229-4856-b3ec-a882e6fba5a6" containerName="extract" Feb 27 00:21:22 crc kubenswrapper[4781]: I0227 00:21:22.910715 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2112f4cb-1229-4856-b3ec-a882e6fba5a6" containerName="extract" Feb 27 00:21:22 crc kubenswrapper[4781]: E0227 00:21:22.910732 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2112f4cb-1229-4856-b3ec-a882e6fba5a6" containerName="pull" Feb 27 00:21:22 crc kubenswrapper[4781]: I0227 00:21:22.910738 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2112f4cb-1229-4856-b3ec-a882e6fba5a6" containerName="pull" Feb 27 00:21:22 crc kubenswrapper[4781]: E0227 00:21:22.910750 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76705148-274c-4428-9508-13fe1193646e" containerName="console" Feb 27 00:21:22 crc kubenswrapper[4781]: I0227 00:21:22.910758 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="76705148-274c-4428-9508-13fe1193646e" containerName="console" Feb 27 00:21:22 crc kubenswrapper[4781]: E0227 00:21:22.910773 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2112f4cb-1229-4856-b3ec-a882e6fba5a6" containerName="util" Feb 27 00:21:22 crc kubenswrapper[4781]: I0227 00:21:22.910779 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2112f4cb-1229-4856-b3ec-a882e6fba5a6" containerName="util" Feb 27 00:21:22 crc kubenswrapper[4781]: I0227 00:21:22.910873 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="76705148-274c-4428-9508-13fe1193646e" 
containerName="console" Feb 27 00:21:22 crc kubenswrapper[4781]: I0227 00:21:22.910886 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2112f4cb-1229-4856-b3ec-a882e6fba5a6" containerName="extract" Feb 27 00:21:22 crc kubenswrapper[4781]: I0227 00:21:22.911242 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk" Feb 27 00:21:22 crc kubenswrapper[4781]: I0227 00:21:22.913025 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 27 00:21:22 crc kubenswrapper[4781]: I0227 00:21:22.913488 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 27 00:21:22 crc kubenswrapper[4781]: I0227 00:21:22.914004 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 27 00:21:22 crc kubenswrapper[4781]: I0227 00:21:22.914322 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 27 00:21:22 crc kubenswrapper[4781]: I0227 00:21:22.915056 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-ggqth" Feb 27 00:21:22 crc kubenswrapper[4781]: I0227 00:21:22.928990 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk"] Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.014311 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7020f39f-9738-4625-bd18-e5e4e64f5956-apiservice-cert\") pod \"metallb-operator-controller-manager-7586d66d7b-59ntk\" (UID: \"7020f39f-9738-4625-bd18-e5e4e64f5956\") " pod="metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk" Feb 
27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.014368 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7020f39f-9738-4625-bd18-e5e4e64f5956-webhook-cert\") pod \"metallb-operator-controller-manager-7586d66d7b-59ntk\" (UID: \"7020f39f-9738-4625-bd18-e5e4e64f5956\") " pod="metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.014429 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjzsx\" (UniqueName: \"kubernetes.io/projected/7020f39f-9738-4625-bd18-e5e4e64f5956-kube-api-access-fjzsx\") pod \"metallb-operator-controller-manager-7586d66d7b-59ntk\" (UID: \"7020f39f-9738-4625-bd18-e5e4e64f5956\") " pod="metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.115221 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7020f39f-9738-4625-bd18-e5e4e64f5956-apiservice-cert\") pod \"metallb-operator-controller-manager-7586d66d7b-59ntk\" (UID: \"7020f39f-9738-4625-bd18-e5e4e64f5956\") " pod="metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.115277 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7020f39f-9738-4625-bd18-e5e4e64f5956-webhook-cert\") pod \"metallb-operator-controller-manager-7586d66d7b-59ntk\" (UID: \"7020f39f-9738-4625-bd18-e5e4e64f5956\") " pod="metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.115322 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjzsx\" (UniqueName: 
\"kubernetes.io/projected/7020f39f-9738-4625-bd18-e5e4e64f5956-kube-api-access-fjzsx\") pod \"metallb-operator-controller-manager-7586d66d7b-59ntk\" (UID: \"7020f39f-9738-4625-bd18-e5e4e64f5956\") " pod="metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.137484 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7020f39f-9738-4625-bd18-e5e4e64f5956-apiservice-cert\") pod \"metallb-operator-controller-manager-7586d66d7b-59ntk\" (UID: \"7020f39f-9738-4625-bd18-e5e4e64f5956\") " pod="metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.137486 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7020f39f-9738-4625-bd18-e5e4e64f5956-webhook-cert\") pod \"metallb-operator-controller-manager-7586d66d7b-59ntk\" (UID: \"7020f39f-9738-4625-bd18-e5e4e64f5956\") " pod="metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.139916 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjzsx\" (UniqueName: \"kubernetes.io/projected/7020f39f-9738-4625-bd18-e5e4e64f5956-kube-api-access-fjzsx\") pod \"metallb-operator-controller-manager-7586d66d7b-59ntk\" (UID: \"7020f39f-9738-4625-bd18-e5e4e64f5956\") " pod="metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.145127 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v"] Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.146007 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.150650 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-wfgd6" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.150881 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.151422 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.157265 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v"] Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.216159 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fc2d6f99-bd3f-44e8-91fc-6865285089e7-webhook-cert\") pod \"metallb-operator-webhook-server-58cbff967-5sp8v\" (UID: \"fc2d6f99-bd3f-44e8-91fc-6865285089e7\") " pod="metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.216231 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fc2d6f99-bd3f-44e8-91fc-6865285089e7-apiservice-cert\") pod \"metallb-operator-webhook-server-58cbff967-5sp8v\" (UID: \"fc2d6f99-bd3f-44e8-91fc-6865285089e7\") " pod="metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.216453 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9sxk\" (UniqueName: 
\"kubernetes.io/projected/fc2d6f99-bd3f-44e8-91fc-6865285089e7-kube-api-access-r9sxk\") pod \"metallb-operator-webhook-server-58cbff967-5sp8v\" (UID: \"fc2d6f99-bd3f-44e8-91fc-6865285089e7\") " pod="metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.227023 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.317256 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9sxk\" (UniqueName: \"kubernetes.io/projected/fc2d6f99-bd3f-44e8-91fc-6865285089e7-kube-api-access-r9sxk\") pod \"metallb-operator-webhook-server-58cbff967-5sp8v\" (UID: \"fc2d6f99-bd3f-44e8-91fc-6865285089e7\") " pod="metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.317495 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fc2d6f99-bd3f-44e8-91fc-6865285089e7-webhook-cert\") pod \"metallb-operator-webhook-server-58cbff967-5sp8v\" (UID: \"fc2d6f99-bd3f-44e8-91fc-6865285089e7\") " pod="metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.317532 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fc2d6f99-bd3f-44e8-91fc-6865285089e7-apiservice-cert\") pod \"metallb-operator-webhook-server-58cbff967-5sp8v\" (UID: \"fc2d6f99-bd3f-44e8-91fc-6865285089e7\") " pod="metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.324817 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/fc2d6f99-bd3f-44e8-91fc-6865285089e7-apiservice-cert\") pod \"metallb-operator-webhook-server-58cbff967-5sp8v\" (UID: \"fc2d6f99-bd3f-44e8-91fc-6865285089e7\") " pod="metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.334878 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fc2d6f99-bd3f-44e8-91fc-6865285089e7-webhook-cert\") pod \"metallb-operator-webhook-server-58cbff967-5sp8v\" (UID: \"fc2d6f99-bd3f-44e8-91fc-6865285089e7\") " pod="metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.335277 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9sxk\" (UniqueName: \"kubernetes.io/projected/fc2d6f99-bd3f-44e8-91fc-6865285089e7-kube-api-access-r9sxk\") pod \"metallb-operator-webhook-server-58cbff967-5sp8v\" (UID: \"fc2d6f99-bd3f-44e8-91fc-6865285089e7\") " pod="metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.472918 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v" Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.668890 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk"] Feb 27 00:21:23 crc kubenswrapper[4781]: W0227 00:21:23.683172 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7020f39f_9738_4625_bd18_e5e4e64f5956.slice/crio-774de6d7742a7d040ae61feaa4484837b3d9ed78423d8cd297d214bf8bdf3172 WatchSource:0}: Error finding container 774de6d7742a7d040ae61feaa4484837b3d9ed78423d8cd297d214bf8bdf3172: Status 404 returned error can't find the container with id 774de6d7742a7d040ae61feaa4484837b3d9ed78423d8cd297d214bf8bdf3172 Feb 27 00:21:23 crc kubenswrapper[4781]: I0227 00:21:23.780058 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v"] Feb 27 00:21:23 crc kubenswrapper[4781]: W0227 00:21:23.781367 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc2d6f99_bd3f_44e8_91fc_6865285089e7.slice/crio-09876a7185cb41db89ce204b51999b41bbfe3bf4efacd45611c321f31d8985f7 WatchSource:0}: Error finding container 09876a7185cb41db89ce204b51999b41bbfe3bf4efacd45611c321f31d8985f7: Status 404 returned error can't find the container with id 09876a7185cb41db89ce204b51999b41bbfe3bf4efacd45611c321f31d8985f7 Feb 27 00:21:24 crc kubenswrapper[4781]: I0227 00:21:24.662610 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v" event={"ID":"fc2d6f99-bd3f-44e8-91fc-6865285089e7","Type":"ContainerStarted","Data":"09876a7185cb41db89ce204b51999b41bbfe3bf4efacd45611c321f31d8985f7"} Feb 27 00:21:24 crc kubenswrapper[4781]: I0227 00:21:24.664171 4781 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk" event={"ID":"7020f39f-9738-4625-bd18-e5e4e64f5956","Type":"ContainerStarted","Data":"774de6d7742a7d040ae61feaa4484837b3d9ed78423d8cd297d214bf8bdf3172"} Feb 27 00:21:29 crc kubenswrapper[4781]: I0227 00:21:29.697917 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v" event={"ID":"fc2d6f99-bd3f-44e8-91fc-6865285089e7","Type":"ContainerStarted","Data":"e4bc153d14b08884f0e2c27fe60ed230bdb4b4f20cac1e6f22273c11caf74a2d"} Feb 27 00:21:29 crc kubenswrapper[4781]: I0227 00:21:29.698705 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v" Feb 27 00:21:29 crc kubenswrapper[4781]: I0227 00:21:29.705066 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk" event={"ID":"7020f39f-9738-4625-bd18-e5e4e64f5956","Type":"ContainerStarted","Data":"ad28ac70dc1334490f5a6920c6bb8dc290a7c8df00df805e4638c4994f1eb331"} Feb 27 00:21:29 crc kubenswrapper[4781]: I0227 00:21:29.705477 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk" Feb 27 00:21:29 crc kubenswrapper[4781]: I0227 00:21:29.746547 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v" podStartSLOduration=1.183455584 podStartE2EDuration="6.74652652s" podCreationTimestamp="2026-02-27 00:21:23 +0000 UTC" firstStartedPulling="2026-02-27 00:21:23.784705098 +0000 UTC m=+953.042244652" lastFinishedPulling="2026-02-27 00:21:29.347776034 +0000 UTC m=+958.605315588" observedRunningTime="2026-02-27 00:21:29.739115951 +0000 UTC m=+958.996655525" watchObservedRunningTime="2026-02-27 00:21:29.74652652 +0000 UTC m=+959.004066074" Feb 27 00:21:29 crc 
kubenswrapper[4781]: I0227 00:21:29.759712 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk" podStartSLOduration=2.115557466 podStartE2EDuration="7.759691164s" podCreationTimestamp="2026-02-27 00:21:22 +0000 UTC" firstStartedPulling="2026-02-27 00:21:23.689299941 +0000 UTC m=+952.946839495" lastFinishedPulling="2026-02-27 00:21:29.333433639 +0000 UTC m=+958.590973193" observedRunningTime="2026-02-27 00:21:29.758227867 +0000 UTC m=+959.015767501" watchObservedRunningTime="2026-02-27 00:21:29.759691164 +0000 UTC m=+959.017230718" Feb 27 00:21:43 crc kubenswrapper[4781]: I0227 00:21:43.477899 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-58cbff967-5sp8v" Feb 27 00:22:00 crc kubenswrapper[4781]: I0227 00:22:00.129646 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535862-l9vc5"] Feb 27 00:22:00 crc kubenswrapper[4781]: I0227 00:22:00.130951 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535862-l9vc5" Feb 27 00:22:00 crc kubenswrapper[4781]: I0227 00:22:00.133099 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:22:00 crc kubenswrapper[4781]: I0227 00:22:00.133197 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:22:00 crc kubenswrapper[4781]: I0227 00:22:00.133278 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:22:00 crc kubenswrapper[4781]: I0227 00:22:00.140603 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535862-l9vc5"] Feb 27 00:22:00 crc kubenswrapper[4781]: I0227 00:22:00.225812 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmtbc\" (UniqueName: \"kubernetes.io/projected/411dc0f9-584c-453b-a137-189ab8731570-kube-api-access-jmtbc\") pod \"auto-csr-approver-29535862-l9vc5\" (UID: \"411dc0f9-584c-453b-a137-189ab8731570\") " pod="openshift-infra/auto-csr-approver-29535862-l9vc5" Feb 27 00:22:00 crc kubenswrapper[4781]: I0227 00:22:00.326573 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmtbc\" (UniqueName: \"kubernetes.io/projected/411dc0f9-584c-453b-a137-189ab8731570-kube-api-access-jmtbc\") pod \"auto-csr-approver-29535862-l9vc5\" (UID: \"411dc0f9-584c-453b-a137-189ab8731570\") " pod="openshift-infra/auto-csr-approver-29535862-l9vc5" Feb 27 00:22:00 crc kubenswrapper[4781]: I0227 00:22:00.345423 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmtbc\" (UniqueName: \"kubernetes.io/projected/411dc0f9-584c-453b-a137-189ab8731570-kube-api-access-jmtbc\") pod \"auto-csr-approver-29535862-l9vc5\" (UID: \"411dc0f9-584c-453b-a137-189ab8731570\") " 
pod="openshift-infra/auto-csr-approver-29535862-l9vc5" Feb 27 00:22:00 crc kubenswrapper[4781]: I0227 00:22:00.450169 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535862-l9vc5" Feb 27 00:22:00 crc kubenswrapper[4781]: I0227 00:22:00.866900 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535862-l9vc5"] Feb 27 00:22:00 crc kubenswrapper[4781]: I0227 00:22:00.931577 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535862-l9vc5" event={"ID":"411dc0f9-584c-453b-a137-189ab8731570","Type":"ContainerStarted","Data":"a3bf3c618ee2624440aeefa3c46dec475d48ce699c73b107c9ab38efd57223c1"} Feb 27 00:22:02 crc kubenswrapper[4781]: I0227 00:22:02.303333 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tcfvp"] Feb 27 00:22:02 crc kubenswrapper[4781]: I0227 00:22:02.305737 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tcfvp" Feb 27 00:22:02 crc kubenswrapper[4781]: I0227 00:22:02.329115 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tcfvp"] Feb 27 00:22:02 crc kubenswrapper[4781]: I0227 00:22:02.458002 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvznh\" (UniqueName: \"kubernetes.io/projected/d292517f-9d33-4590-beae-e0810b1395fa-kube-api-access-vvznh\") pod \"community-operators-tcfvp\" (UID: \"d292517f-9d33-4590-beae-e0810b1395fa\") " pod="openshift-marketplace/community-operators-tcfvp" Feb 27 00:22:02 crc kubenswrapper[4781]: I0227 00:22:02.458050 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d292517f-9d33-4590-beae-e0810b1395fa-catalog-content\") pod \"community-operators-tcfvp\" (UID: \"d292517f-9d33-4590-beae-e0810b1395fa\") " pod="openshift-marketplace/community-operators-tcfvp" Feb 27 00:22:02 crc kubenswrapper[4781]: I0227 00:22:02.458140 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d292517f-9d33-4590-beae-e0810b1395fa-utilities\") pod \"community-operators-tcfvp\" (UID: \"d292517f-9d33-4590-beae-e0810b1395fa\") " pod="openshift-marketplace/community-operators-tcfvp" Feb 27 00:22:02 crc kubenswrapper[4781]: I0227 00:22:02.559755 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvznh\" (UniqueName: \"kubernetes.io/projected/d292517f-9d33-4590-beae-e0810b1395fa-kube-api-access-vvznh\") pod \"community-operators-tcfvp\" (UID: \"d292517f-9d33-4590-beae-e0810b1395fa\") " pod="openshift-marketplace/community-operators-tcfvp" Feb 27 00:22:02 crc kubenswrapper[4781]: I0227 00:22:02.560382 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d292517f-9d33-4590-beae-e0810b1395fa-catalog-content\") pod \"community-operators-tcfvp\" (UID: \"d292517f-9d33-4590-beae-e0810b1395fa\") " pod="openshift-marketplace/community-operators-tcfvp" Feb 27 00:22:02 crc kubenswrapper[4781]: I0227 00:22:02.560888 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d292517f-9d33-4590-beae-e0810b1395fa-catalog-content\") pod \"community-operators-tcfvp\" (UID: \"d292517f-9d33-4590-beae-e0810b1395fa\") " pod="openshift-marketplace/community-operators-tcfvp" Feb 27 00:22:02 crc kubenswrapper[4781]: I0227 00:22:02.561100 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d292517f-9d33-4590-beae-e0810b1395fa-utilities\") pod \"community-operators-tcfvp\" (UID: \"d292517f-9d33-4590-beae-e0810b1395fa\") " pod="openshift-marketplace/community-operators-tcfvp" Feb 27 00:22:02 crc kubenswrapper[4781]: I0227 00:22:02.561535 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d292517f-9d33-4590-beae-e0810b1395fa-utilities\") pod \"community-operators-tcfvp\" (UID: \"d292517f-9d33-4590-beae-e0810b1395fa\") " pod="openshift-marketplace/community-operators-tcfvp" Feb 27 00:22:02 crc kubenswrapper[4781]: I0227 00:22:02.578584 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvznh\" (UniqueName: \"kubernetes.io/projected/d292517f-9d33-4590-beae-e0810b1395fa-kube-api-access-vvznh\") pod \"community-operators-tcfvp\" (UID: \"d292517f-9d33-4590-beae-e0810b1395fa\") " pod="openshift-marketplace/community-operators-tcfvp" Feb 27 00:22:02 crc kubenswrapper[4781]: I0227 00:22:02.624431 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tcfvp" Feb 27 00:22:02 crc kubenswrapper[4781]: I0227 00:22:02.944281 4781 generic.go:334] "Generic (PLEG): container finished" podID="411dc0f9-584c-453b-a137-189ab8731570" containerID="576df563fec491fe4b88b02b86a929d4019c459ebde0d69bbe30c74025de222c" exitCode=0 Feb 27 00:22:02 crc kubenswrapper[4781]: I0227 00:22:02.944438 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535862-l9vc5" event={"ID":"411dc0f9-584c-453b-a137-189ab8731570","Type":"ContainerDied","Data":"576df563fec491fe4b88b02b86a929d4019c459ebde0d69bbe30c74025de222c"} Feb 27 00:22:03 crc kubenswrapper[4781]: I0227 00:22:03.137593 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tcfvp"] Feb 27 00:22:03 crc kubenswrapper[4781]: W0227 00:22:03.141967 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd292517f_9d33_4590_beae_e0810b1395fa.slice/crio-dc30d05e6e59ba685bb93b7eed25462156f94fb18da11a6113fc7887bdfc1c16 WatchSource:0}: Error finding container dc30d05e6e59ba685bb93b7eed25462156f94fb18da11a6113fc7887bdfc1c16: Status 404 returned error can't find the container with id dc30d05e6e59ba685bb93b7eed25462156f94fb18da11a6113fc7887bdfc1c16 Feb 27 00:22:03 crc kubenswrapper[4781]: I0227 00:22:03.229418 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7586d66d7b-59ntk" Feb 27 00:22:03 crc kubenswrapper[4781]: I0227 00:22:03.955115 4781 generic.go:334] "Generic (PLEG): container finished" podID="d292517f-9d33-4590-beae-e0810b1395fa" containerID="3f48d77562b78b7c4eb9406b5bae70799989c1e6ce32c4c8dfce12eef51c7679" exitCode=0 Feb 27 00:22:03 crc kubenswrapper[4781]: I0227 00:22:03.955225 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-tcfvp" event={"ID":"d292517f-9d33-4590-beae-e0810b1395fa","Type":"ContainerDied","Data":"3f48d77562b78b7c4eb9406b5bae70799989c1e6ce32c4c8dfce12eef51c7679"} Feb 27 00:22:03 crc kubenswrapper[4781]: I0227 00:22:03.955481 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tcfvp" event={"ID":"d292517f-9d33-4590-beae-e0810b1395fa","Type":"ContainerStarted","Data":"dc30d05e6e59ba685bb93b7eed25462156f94fb18da11a6113fc7887bdfc1c16"} Feb 27 00:22:03 crc kubenswrapper[4781]: I0227 00:22:03.968207 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-j2n85"] Feb 27 00:22:03 crc kubenswrapper[4781]: I0227 00:22:03.971406 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-j2n85" Feb 27 00:22:03 crc kubenswrapper[4781]: I0227 00:22:03.977384 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 27 00:22:03 crc kubenswrapper[4781]: I0227 00:22:03.977590 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 27 00:22:03 crc kubenswrapper[4781]: I0227 00:22:03.977609 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-kk2xt" Feb 27 00:22:03 crc kubenswrapper[4781]: I0227 00:22:03.989355 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-cqkgx"] Feb 27 00:22:03 crc kubenswrapper[4781]: I0227 00:22:03.990279 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cqkgx" Feb 27 00:22:03 crc kubenswrapper[4781]: I0227 00:22:03.997469 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-cqkgx"] Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.000236 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.063657 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-tljmv"] Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.064579 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-tljmv" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.072698 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.072808 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.073466 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-jlhm7" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.073586 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.087193 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6cpn\" (UniqueName: \"kubernetes.io/projected/43006307-3a88-4e83-b57f-965df4bd043d-kube-api-access-c6cpn\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.087248 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/43006307-3a88-4e83-b57f-965df4bd043d-frr-startup\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.087277 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31409f77-5542-4376-8d77-c7a018b245b7-cert\") pod \"frr-k8s-webhook-server-7f989f654f-cqkgx\" (UID: \"31409f77-5542-4376-8d77-c7a018b245b7\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cqkgx" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.087330 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/43006307-3a88-4e83-b57f-965df4bd043d-metrics\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.087362 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/43006307-3a88-4e83-b57f-965df4bd043d-metrics-certs\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.087393 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4k97\" (UniqueName: \"kubernetes.io/projected/31409f77-5542-4376-8d77-c7a018b245b7-kube-api-access-r4k97\") pod \"frr-k8s-webhook-server-7f989f654f-cqkgx\" (UID: \"31409f77-5542-4376-8d77-c7a018b245b7\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cqkgx" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.087422 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/43006307-3a88-4e83-b57f-965df4bd043d-frr-conf\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.087452 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/43006307-3a88-4e83-b57f-965df4bd043d-frr-sockets\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.087478 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/43006307-3a88-4e83-b57f-965df4bd043d-reloader\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.090679 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-c6m2v"] Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.092005 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-c6m2v" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.096806 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.111525 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-c6m2v"] Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.197409 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc6f679c-913d-4851-b69d-a2e26ebf450a-metrics-certs\") pod \"controller-86ddb6bd46-c6m2v\" (UID: \"dc6f679c-913d-4851-b69d-a2e26ebf450a\") " pod="metallb-system/controller-86ddb6bd46-c6m2v" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.198340 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/43006307-3a88-4e83-b57f-965df4bd043d-metrics\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.198374 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/43006307-3a88-4e83-b57f-965df4bd043d-metrics-certs\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.198435 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4k97\" (UniqueName: \"kubernetes.io/projected/31409f77-5542-4376-8d77-c7a018b245b7-kube-api-access-r4k97\") pod \"frr-k8s-webhook-server-7f989f654f-cqkgx\" (UID: \"31409f77-5542-4376-8d77-c7a018b245b7\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cqkgx" Feb 27 00:22:04 crc 
kubenswrapper[4781]: I0227 00:22:04.198463 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5d7e20ea-c069-4c29-9c3f-1ac3404f026c-memberlist\") pod \"speaker-tljmv\" (UID: \"5d7e20ea-c069-4c29-9c3f-1ac3404f026c\") " pod="metallb-system/speaker-tljmv" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.198512 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/43006307-3a88-4e83-b57f-965df4bd043d-frr-conf\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.198537 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc6f679c-913d-4851-b69d-a2e26ebf450a-cert\") pod \"controller-86ddb6bd46-c6m2v\" (UID: \"dc6f679c-913d-4851-b69d-a2e26ebf450a\") " pod="metallb-system/controller-86ddb6bd46-c6m2v" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.198558 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/43006307-3a88-4e83-b57f-965df4bd043d-frr-sockets\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.198602 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/43006307-3a88-4e83-b57f-965df4bd043d-reloader\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.198662 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/5d7e20ea-c069-4c29-9c3f-1ac3404f026c-metrics-certs\") pod \"speaker-tljmv\" (UID: \"5d7e20ea-c069-4c29-9c3f-1ac3404f026c\") " pod="metallb-system/speaker-tljmv" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.198704 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf5fg\" (UniqueName: \"kubernetes.io/projected/dc6f679c-913d-4851-b69d-a2e26ebf450a-kube-api-access-qf5fg\") pod \"controller-86ddb6bd46-c6m2v\" (UID: \"dc6f679c-913d-4851-b69d-a2e26ebf450a\") " pod="metallb-system/controller-86ddb6bd46-c6m2v" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.198750 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/43006307-3a88-4e83-b57f-965df4bd043d-metrics\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.198775 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vx4c\" (UniqueName: \"kubernetes.io/projected/5d7e20ea-c069-4c29-9c3f-1ac3404f026c-kube-api-access-5vx4c\") pod \"speaker-tljmv\" (UID: \"5d7e20ea-c069-4c29-9c3f-1ac3404f026c\") " pod="metallb-system/speaker-tljmv" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.198878 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6cpn\" (UniqueName: \"kubernetes.io/projected/43006307-3a88-4e83-b57f-965df4bd043d-kube-api-access-c6cpn\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.198903 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: 
\"kubernetes.io/configmap/5d7e20ea-c069-4c29-9c3f-1ac3404f026c-metallb-excludel2\") pod \"speaker-tljmv\" (UID: \"5d7e20ea-c069-4c29-9c3f-1ac3404f026c\") " pod="metallb-system/speaker-tljmv" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.198926 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/43006307-3a88-4e83-b57f-965df4bd043d-frr-startup\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.198953 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31409f77-5542-4376-8d77-c7a018b245b7-cert\") pod \"frr-k8s-webhook-server-7f989f654f-cqkgx\" (UID: \"31409f77-5542-4376-8d77-c7a018b245b7\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cqkgx" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.200586 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/43006307-3a88-4e83-b57f-965df4bd043d-frr-conf\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.201497 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/43006307-3a88-4e83-b57f-965df4bd043d-frr-startup\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.201820 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/43006307-3a88-4e83-b57f-965df4bd043d-reloader\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85" Feb 27 00:22:04 crc 
kubenswrapper[4781]: I0227 00:22:04.202325 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/43006307-3a88-4e83-b57f-965df4bd043d-frr-sockets\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.207134 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/43006307-3a88-4e83-b57f-965df4bd043d-metrics-certs\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.218351 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6cpn\" (UniqueName: \"kubernetes.io/projected/43006307-3a88-4e83-b57f-965df4bd043d-kube-api-access-c6cpn\") pod \"frr-k8s-j2n85\" (UID: \"43006307-3a88-4e83-b57f-965df4bd043d\") " pod="metallb-system/frr-k8s-j2n85" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.220478 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4k97\" (UniqueName: \"kubernetes.io/projected/31409f77-5542-4376-8d77-c7a018b245b7-kube-api-access-r4k97\") pod \"frr-k8s-webhook-server-7f989f654f-cqkgx\" (UID: \"31409f77-5542-4376-8d77-c7a018b245b7\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cqkgx" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.226850 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31409f77-5542-4376-8d77-c7a018b245b7-cert\") pod \"frr-k8s-webhook-server-7f989f654f-cqkgx\" (UID: \"31409f77-5542-4376-8d77-c7a018b245b7\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cqkgx" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.287329 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535862-l9vc5" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.293977 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-j2n85" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.300501 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5d7e20ea-c069-4c29-9c3f-1ac3404f026c-metallb-excludel2\") pod \"speaker-tljmv\" (UID: \"5d7e20ea-c069-4c29-9c3f-1ac3404f026c\") " pod="metallb-system/speaker-tljmv" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.300568 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc6f679c-913d-4851-b69d-a2e26ebf450a-metrics-certs\") pod \"controller-86ddb6bd46-c6m2v\" (UID: \"dc6f679c-913d-4851-b69d-a2e26ebf450a\") " pod="metallb-system/controller-86ddb6bd46-c6m2v" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.300607 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5d7e20ea-c069-4c29-9c3f-1ac3404f026c-memberlist\") pod \"speaker-tljmv\" (UID: \"5d7e20ea-c069-4c29-9c3f-1ac3404f026c\") " pod="metallb-system/speaker-tljmv" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.300691 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc6f679c-913d-4851-b69d-a2e26ebf450a-cert\") pod \"controller-86ddb6bd46-c6m2v\" (UID: \"dc6f679c-913d-4851-b69d-a2e26ebf450a\") " pod="metallb-system/controller-86ddb6bd46-c6m2v" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.300726 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5d7e20ea-c069-4c29-9c3f-1ac3404f026c-metrics-certs\") pod 
\"speaker-tljmv\" (UID: \"5d7e20ea-c069-4c29-9c3f-1ac3404f026c\") " pod="metallb-system/speaker-tljmv" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.300767 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf5fg\" (UniqueName: \"kubernetes.io/projected/dc6f679c-913d-4851-b69d-a2e26ebf450a-kube-api-access-qf5fg\") pod \"controller-86ddb6bd46-c6m2v\" (UID: \"dc6f679c-913d-4851-b69d-a2e26ebf450a\") " pod="metallb-system/controller-86ddb6bd46-c6m2v" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.300799 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vx4c\" (UniqueName: \"kubernetes.io/projected/5d7e20ea-c069-4c29-9c3f-1ac3404f026c-kube-api-access-5vx4c\") pod \"speaker-tljmv\" (UID: \"5d7e20ea-c069-4c29-9c3f-1ac3404f026c\") " pod="metallb-system/speaker-tljmv" Feb 27 00:22:04 crc kubenswrapper[4781]: E0227 00:22:04.300797 4781 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 27 00:22:04 crc kubenswrapper[4781]: E0227 00:22:04.300892 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d7e20ea-c069-4c29-9c3f-1ac3404f026c-memberlist podName:5d7e20ea-c069-4c29-9c3f-1ac3404f026c nodeName:}" failed. No retries permitted until 2026-02-27 00:22:04.800871751 +0000 UTC m=+994.058411305 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/5d7e20ea-c069-4c29-9c3f-1ac3404f026c-memberlist") pod "speaker-tljmv" (UID: "5d7e20ea-c069-4c29-9c3f-1ac3404f026c") : secret "metallb-memberlist" not found Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.301568 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5d7e20ea-c069-4c29-9c3f-1ac3404f026c-metallb-excludel2\") pod \"speaker-tljmv\" (UID: \"5d7e20ea-c069-4c29-9c3f-1ac3404f026c\") " pod="metallb-system/speaker-tljmv" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.303669 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.304878 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5d7e20ea-c069-4c29-9c3f-1ac3404f026c-metrics-certs\") pod \"speaker-tljmv\" (UID: \"5d7e20ea-c069-4c29-9c3f-1ac3404f026c\") " pod="metallb-system/speaker-tljmv" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.305399 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc6f679c-913d-4851-b69d-a2e26ebf450a-metrics-certs\") pod \"controller-86ddb6bd46-c6m2v\" (UID: \"dc6f679c-913d-4851-b69d-a2e26ebf450a\") " pod="metallb-system/controller-86ddb6bd46-c6m2v" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.312580 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cqkgx" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.318945 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc6f679c-913d-4851-b69d-a2e26ebf450a-cert\") pod \"controller-86ddb6bd46-c6m2v\" (UID: \"dc6f679c-913d-4851-b69d-a2e26ebf450a\") " pod="metallb-system/controller-86ddb6bd46-c6m2v" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.320668 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vx4c\" (UniqueName: \"kubernetes.io/projected/5d7e20ea-c069-4c29-9c3f-1ac3404f026c-kube-api-access-5vx4c\") pod \"speaker-tljmv\" (UID: \"5d7e20ea-c069-4c29-9c3f-1ac3404f026c\") " pod="metallb-system/speaker-tljmv" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.323387 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf5fg\" (UniqueName: \"kubernetes.io/projected/dc6f679c-913d-4851-b69d-a2e26ebf450a-kube-api-access-qf5fg\") pod \"controller-86ddb6bd46-c6m2v\" (UID: \"dc6f679c-913d-4851-b69d-a2e26ebf450a\") " pod="metallb-system/controller-86ddb6bd46-c6m2v" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.402060 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmtbc\" (UniqueName: \"kubernetes.io/projected/411dc0f9-584c-453b-a137-189ab8731570-kube-api-access-jmtbc\") pod \"411dc0f9-584c-453b-a137-189ab8731570\" (UID: \"411dc0f9-584c-453b-a137-189ab8731570\") " Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.406081 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/411dc0f9-584c-453b-a137-189ab8731570-kube-api-access-jmtbc" (OuterVolumeSpecName: "kube-api-access-jmtbc") pod "411dc0f9-584c-453b-a137-189ab8731570" (UID: "411dc0f9-584c-453b-a137-189ab8731570"). InnerVolumeSpecName "kube-api-access-jmtbc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.413006 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-c6m2v" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.503656 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmtbc\" (UniqueName: \"kubernetes.io/projected/411dc0f9-584c-453b-a137-189ab8731570-kube-api-access-jmtbc\") on node \"crc\" DevicePath \"\"" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.645942 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-c6m2v"] Feb 27 00:22:04 crc kubenswrapper[4781]: W0227 00:22:04.657049 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc6f679c_913d_4851_b69d_a2e26ebf450a.slice/crio-a16333594994dc88f16db6f7a76d6fff70645922ee4b477ccd2ef2ef339da335 WatchSource:0}: Error finding container a16333594994dc88f16db6f7a76d6fff70645922ee4b477ccd2ef2ef339da335: Status 404 returned error can't find the container with id a16333594994dc88f16db6f7a76d6fff70645922ee4b477ccd2ef2ef339da335 Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.754574 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-cqkgx"] Feb 27 00:22:04 crc kubenswrapper[4781]: W0227 00:22:04.759880 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31409f77_5542_4376_8d77_c7a018b245b7.slice/crio-162d067f06ee8ce69c09b9a544e249ca9c866a8714079288e2bbeaaa36bf057d WatchSource:0}: Error finding container 162d067f06ee8ce69c09b9a544e249ca9c866a8714079288e2bbeaaa36bf057d: Status 404 returned error can't find the container with id 162d067f06ee8ce69c09b9a544e249ca9c866a8714079288e2bbeaaa36bf057d Feb 27 00:22:04 crc kubenswrapper[4781]: 
I0227 00:22:04.808211 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5d7e20ea-c069-4c29-9c3f-1ac3404f026c-memberlist\") pod \"speaker-tljmv\" (UID: \"5d7e20ea-c069-4c29-9c3f-1ac3404f026c\") " pod="metallb-system/speaker-tljmv" Feb 27 00:22:04 crc kubenswrapper[4781]: E0227 00:22:04.808408 4781 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 27 00:22:04 crc kubenswrapper[4781]: E0227 00:22:04.808826 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d7e20ea-c069-4c29-9c3f-1ac3404f026c-memberlist podName:5d7e20ea-c069-4c29-9c3f-1ac3404f026c nodeName:}" failed. No retries permitted until 2026-02-27 00:22:05.808799695 +0000 UTC m=+995.066339249 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/5d7e20ea-c069-4c29-9c3f-1ac3404f026c-memberlist") pod "speaker-tljmv" (UID: "5d7e20ea-c069-4c29-9c3f-1ac3404f026c") : secret "metallb-memberlist" not found Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.963260 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tcfvp" event={"ID":"d292517f-9d33-4590-beae-e0810b1395fa","Type":"ContainerStarted","Data":"fbb42f4e19a7208fb759ce636689c17c2b7bc5e5f12afafb5c11f053e74607ac"} Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.964707 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cqkgx" event={"ID":"31409f77-5542-4376-8d77-c7a018b245b7","Type":"ContainerStarted","Data":"162d067f06ee8ce69c09b9a544e249ca9c866a8714079288e2bbeaaa36bf057d"} Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.966247 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535862-l9vc5" 
event={"ID":"411dc0f9-584c-453b-a137-189ab8731570","Type":"ContainerDied","Data":"a3bf3c618ee2624440aeefa3c46dec475d48ce699c73b107c9ab38efd57223c1"} Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.966291 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3bf3c618ee2624440aeefa3c46dec475d48ce699c73b107c9ab38efd57223c1" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.966312 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535862-l9vc5" Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.972587 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j2n85" event={"ID":"43006307-3a88-4e83-b57f-965df4bd043d","Type":"ContainerStarted","Data":"f908361cb936292eea5614d03642ac94b248356b5031911f9f7dce351a864876"} Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.973743 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-c6m2v" event={"ID":"dc6f679c-913d-4851-b69d-a2e26ebf450a","Type":"ContainerStarted","Data":"0ae361d3d561edda2bf7a30eef558fd7d3b7a1392f13785b2fa632ebb62dc787"} Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.973771 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-c6m2v" event={"ID":"dc6f679c-913d-4851-b69d-a2e26ebf450a","Type":"ContainerStarted","Data":"fae3bf950a751276dea979e227c7699208eb922107b0f38fc5f75c0f692d9879"} Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.973781 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-c6m2v" event={"ID":"dc6f679c-913d-4851-b69d-a2e26ebf450a","Type":"ContainerStarted","Data":"a16333594994dc88f16db6f7a76d6fff70645922ee4b477ccd2ef2ef339da335"} Feb 27 00:22:04 crc kubenswrapper[4781]: I0227 00:22:04.974381 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/controller-86ddb6bd46-c6m2v" Feb 27 00:22:05 crc kubenswrapper[4781]: I0227 00:22:05.007432 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-c6m2v" podStartSLOduration=1.007415136 podStartE2EDuration="1.007415136s" podCreationTimestamp="2026-02-27 00:22:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:22:05.005686181 +0000 UTC m=+994.263225745" watchObservedRunningTime="2026-02-27 00:22:05.007415136 +0000 UTC m=+994.264954690" Feb 27 00:22:05 crc kubenswrapper[4781]: I0227 00:22:05.335575 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535856-mznwl"] Feb 27 00:22:05 crc kubenswrapper[4781]: I0227 00:22:05.339437 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535856-mznwl"] Feb 27 00:22:05 crc kubenswrapper[4781]: I0227 00:22:05.819980 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5d7e20ea-c069-4c29-9c3f-1ac3404f026c-memberlist\") pod \"speaker-tljmv\" (UID: \"5d7e20ea-c069-4c29-9c3f-1ac3404f026c\") " pod="metallb-system/speaker-tljmv" Feb 27 00:22:05 crc kubenswrapper[4781]: I0227 00:22:05.825859 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5d7e20ea-c069-4c29-9c3f-1ac3404f026c-memberlist\") pod \"speaker-tljmv\" (UID: \"5d7e20ea-c069-4c29-9c3f-1ac3404f026c\") " pod="metallb-system/speaker-tljmv" Feb 27 00:22:05 crc kubenswrapper[4781]: I0227 00:22:05.898597 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-tljmv" Feb 27 00:22:05 crc kubenswrapper[4781]: W0227 00:22:05.934215 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d7e20ea_c069_4c29_9c3f_1ac3404f026c.slice/crio-0fdbf8c09378b719500e9ccdad3108a428c9aad7a3d7ef576fe7190bcf78487d WatchSource:0}: Error finding container 0fdbf8c09378b719500e9ccdad3108a428c9aad7a3d7ef576fe7190bcf78487d: Status 404 returned error can't find the container with id 0fdbf8c09378b719500e9ccdad3108a428c9aad7a3d7ef576fe7190bcf78487d Feb 27 00:22:05 crc kubenswrapper[4781]: I0227 00:22:05.983255 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tljmv" event={"ID":"5d7e20ea-c069-4c29-9c3f-1ac3404f026c","Type":"ContainerStarted","Data":"0fdbf8c09378b719500e9ccdad3108a428c9aad7a3d7ef576fe7190bcf78487d"} Feb 27 00:22:05 crc kubenswrapper[4781]: I0227 00:22:05.986949 4781 generic.go:334] "Generic (PLEG): container finished" podID="d292517f-9d33-4590-beae-e0810b1395fa" containerID="fbb42f4e19a7208fb759ce636689c17c2b7bc5e5f12afafb5c11f053e74607ac" exitCode=0 Feb 27 00:22:05 crc kubenswrapper[4781]: I0227 00:22:05.987002 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tcfvp" event={"ID":"d292517f-9d33-4590-beae-e0810b1395fa","Type":"ContainerDied","Data":"fbb42f4e19a7208fb759ce636689c17c2b7bc5e5f12afafb5c11f053e74607ac"} Feb 27 00:22:07 crc kubenswrapper[4781]: I0227 00:22:06.998877 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tljmv" event={"ID":"5d7e20ea-c069-4c29-9c3f-1ac3404f026c","Type":"ContainerStarted","Data":"1adaf3191383877a7fb7daba3e2b41a4a95c8c9c9e58d88bdc661ccf7ec02d62"} Feb 27 00:22:07 crc kubenswrapper[4781]: I0227 00:22:06.999175 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tljmv" 
event={"ID":"5d7e20ea-c069-4c29-9c3f-1ac3404f026c","Type":"ContainerStarted","Data":"cc53f5a28d5cd6ff49f502e5ea0f9d76e4d5127e689b200232c8930f8cc03f26"} Feb 27 00:22:07 crc kubenswrapper[4781]: I0227 00:22:07.000002 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-tljmv" Feb 27 00:22:07 crc kubenswrapper[4781]: I0227 00:22:07.009189 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tcfvp" event={"ID":"d292517f-9d33-4590-beae-e0810b1395fa","Type":"ContainerStarted","Data":"d2674464663697b7cd845ec2becb3f87c0311300123fbaaf0825ff61fcdf9b0a"} Feb 27 00:22:07 crc kubenswrapper[4781]: I0227 00:22:07.020385 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-tljmv" podStartSLOduration=3.020367204 podStartE2EDuration="3.020367204s" podCreationTimestamp="2026-02-27 00:22:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:22:07.017456388 +0000 UTC m=+996.274995942" watchObservedRunningTime="2026-02-27 00:22:07.020367204 +0000 UTC m=+996.277906758" Feb 27 00:22:07 crc kubenswrapper[4781]: I0227 00:22:07.040672 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tcfvp" podStartSLOduration=2.6171347750000002 podStartE2EDuration="5.040655654s" podCreationTimestamp="2026-02-27 00:22:02 +0000 UTC" firstStartedPulling="2026-02-27 00:22:03.957775444 +0000 UTC m=+993.215315018" lastFinishedPulling="2026-02-27 00:22:06.381296343 +0000 UTC m=+995.638835897" observedRunningTime="2026-02-27 00:22:07.039872034 +0000 UTC m=+996.297411578" watchObservedRunningTime="2026-02-27 00:22:07.040655654 +0000 UTC m=+996.298195208" Feb 27 00:22:07 crc kubenswrapper[4781]: I0227 00:22:07.319398 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="778d83b2-2e0c-45b3-a296-aaba355c6427" path="/var/lib/kubelet/pods/778d83b2-2e0c-45b3-a296-aaba355c6427/volumes" Feb 27 00:22:09 crc kubenswrapper[4781]: I0227 00:22:09.089474 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k995x"] Feb 27 00:22:09 crc kubenswrapper[4781]: E0227 00:22:09.090057 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="411dc0f9-584c-453b-a137-189ab8731570" containerName="oc" Feb 27 00:22:09 crc kubenswrapper[4781]: I0227 00:22:09.090074 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="411dc0f9-584c-453b-a137-189ab8731570" containerName="oc" Feb 27 00:22:09 crc kubenswrapper[4781]: I0227 00:22:09.090214 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="411dc0f9-584c-453b-a137-189ab8731570" containerName="oc" Feb 27 00:22:09 crc kubenswrapper[4781]: I0227 00:22:09.091212 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k995x" Feb 27 00:22:09 crc kubenswrapper[4781]: I0227 00:22:09.106287 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k995x"] Feb 27 00:22:09 crc kubenswrapper[4781]: I0227 00:22:09.265129 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jvpz\" (UniqueName: \"kubernetes.io/projected/586212ca-1380-4fea-a2f1-105fc30f56e3-kube-api-access-4jvpz\") pod \"certified-operators-k995x\" (UID: \"586212ca-1380-4fea-a2f1-105fc30f56e3\") " pod="openshift-marketplace/certified-operators-k995x" Feb 27 00:22:09 crc kubenswrapper[4781]: I0227 00:22:09.265235 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/586212ca-1380-4fea-a2f1-105fc30f56e3-catalog-content\") pod \"certified-operators-k995x\" (UID: 
\"586212ca-1380-4fea-a2f1-105fc30f56e3\") " pod="openshift-marketplace/certified-operators-k995x" Feb 27 00:22:09 crc kubenswrapper[4781]: I0227 00:22:09.265303 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/586212ca-1380-4fea-a2f1-105fc30f56e3-utilities\") pod \"certified-operators-k995x\" (UID: \"586212ca-1380-4fea-a2f1-105fc30f56e3\") " pod="openshift-marketplace/certified-operators-k995x" Feb 27 00:22:09 crc kubenswrapper[4781]: I0227 00:22:09.366901 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/586212ca-1380-4fea-a2f1-105fc30f56e3-utilities\") pod \"certified-operators-k995x\" (UID: \"586212ca-1380-4fea-a2f1-105fc30f56e3\") " pod="openshift-marketplace/certified-operators-k995x" Feb 27 00:22:09 crc kubenswrapper[4781]: I0227 00:22:09.366958 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jvpz\" (UniqueName: \"kubernetes.io/projected/586212ca-1380-4fea-a2f1-105fc30f56e3-kube-api-access-4jvpz\") pod \"certified-operators-k995x\" (UID: \"586212ca-1380-4fea-a2f1-105fc30f56e3\") " pod="openshift-marketplace/certified-operators-k995x" Feb 27 00:22:09 crc kubenswrapper[4781]: I0227 00:22:09.367010 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/586212ca-1380-4fea-a2f1-105fc30f56e3-catalog-content\") pod \"certified-operators-k995x\" (UID: \"586212ca-1380-4fea-a2f1-105fc30f56e3\") " pod="openshift-marketplace/certified-operators-k995x" Feb 27 00:22:09 crc kubenswrapper[4781]: I0227 00:22:09.367521 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/586212ca-1380-4fea-a2f1-105fc30f56e3-catalog-content\") pod \"certified-operators-k995x\" (UID: 
\"586212ca-1380-4fea-a2f1-105fc30f56e3\") " pod="openshift-marketplace/certified-operators-k995x" Feb 27 00:22:09 crc kubenswrapper[4781]: I0227 00:22:09.367524 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/586212ca-1380-4fea-a2f1-105fc30f56e3-utilities\") pod \"certified-operators-k995x\" (UID: \"586212ca-1380-4fea-a2f1-105fc30f56e3\") " pod="openshift-marketplace/certified-operators-k995x" Feb 27 00:22:09 crc kubenswrapper[4781]: I0227 00:22:09.394391 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jvpz\" (UniqueName: \"kubernetes.io/projected/586212ca-1380-4fea-a2f1-105fc30f56e3-kube-api-access-4jvpz\") pod \"certified-operators-k995x\" (UID: \"586212ca-1380-4fea-a2f1-105fc30f56e3\") " pod="openshift-marketplace/certified-operators-k995x" Feb 27 00:22:09 crc kubenswrapper[4781]: I0227 00:22:09.418510 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k995x" Feb 27 00:22:09 crc kubenswrapper[4781]: I0227 00:22:09.714090 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k995x"] Feb 27 00:22:10 crc kubenswrapper[4781]: I0227 00:22:10.044061 4781 generic.go:334] "Generic (PLEG): container finished" podID="586212ca-1380-4fea-a2f1-105fc30f56e3" containerID="54c0d7546be3f43a1d764e4f77f6f61486b1feb8e9e99b36db6fda04741816ed" exitCode=0 Feb 27 00:22:10 crc kubenswrapper[4781]: I0227 00:22:10.044250 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k995x" event={"ID":"586212ca-1380-4fea-a2f1-105fc30f56e3","Type":"ContainerDied","Data":"54c0d7546be3f43a1d764e4f77f6f61486b1feb8e9e99b36db6fda04741816ed"} Feb 27 00:22:10 crc kubenswrapper[4781]: I0227 00:22:10.044390 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k995x" 
event={"ID":"586212ca-1380-4fea-a2f1-105fc30f56e3","Type":"ContainerStarted","Data":"08252891894ad181fdad10de247b6b64df19ad9963bf55e86249173eed72e1a9"} Feb 27 00:22:11 crc kubenswrapper[4781]: I0227 00:22:11.059175 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k995x" event={"ID":"586212ca-1380-4fea-a2f1-105fc30f56e3","Type":"ContainerStarted","Data":"bc8dcc00708b2955324ae604b7e2f739c3a87b6da068a7363051cd49cea954b6"} Feb 27 00:22:12 crc kubenswrapper[4781]: I0227 00:22:12.066443 4781 generic.go:334] "Generic (PLEG): container finished" podID="586212ca-1380-4fea-a2f1-105fc30f56e3" containerID="bc8dcc00708b2955324ae604b7e2f739c3a87b6da068a7363051cd49cea954b6" exitCode=0 Feb 27 00:22:12 crc kubenswrapper[4781]: I0227 00:22:12.067212 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k995x" event={"ID":"586212ca-1380-4fea-a2f1-105fc30f56e3","Type":"ContainerDied","Data":"bc8dcc00708b2955324ae604b7e2f739c3a87b6da068a7363051cd49cea954b6"} Feb 27 00:22:12 crc kubenswrapper[4781]: I0227 00:22:12.624763 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tcfvp" Feb 27 00:22:12 crc kubenswrapper[4781]: I0227 00:22:12.625064 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tcfvp" Feb 27 00:22:12 crc kubenswrapper[4781]: I0227 00:22:12.668965 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tcfvp" Feb 27 00:22:13 crc kubenswrapper[4781]: I0227 00:22:13.126608 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tcfvp" Feb 27 00:22:14 crc kubenswrapper[4781]: I0227 00:22:14.419771 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-c6m2v" 
Feb 27 00:22:15 crc kubenswrapper[4781]: I0227 00:22:15.092979 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k995x" event={"ID":"586212ca-1380-4fea-a2f1-105fc30f56e3","Type":"ContainerStarted","Data":"3ec6cb644e4e7f5e7912db39c5819e0906af9f525a6bdefb01ec3bcd9cc04b98"} Feb 27 00:22:15 crc kubenswrapper[4781]: I0227 00:22:15.094902 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cqkgx" event={"ID":"31409f77-5542-4376-8d77-c7a018b245b7","Type":"ContainerStarted","Data":"9479c6a31a508ff24ffdf215a0552d8503f7f28a59e569d367d9eca78442dc30"} Feb 27 00:22:15 crc kubenswrapper[4781]: I0227 00:22:15.095007 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cqkgx" Feb 27 00:22:15 crc kubenswrapper[4781]: I0227 00:22:15.098171 4781 generic.go:334] "Generic (PLEG): container finished" podID="43006307-3a88-4e83-b57f-965df4bd043d" containerID="7d1a97274b38bf99a1ed86190d4f1d99c2588ba1ecd252177f8d8bb041cc8621" exitCode=0 Feb 27 00:22:15 crc kubenswrapper[4781]: I0227 00:22:15.098309 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j2n85" event={"ID":"43006307-3a88-4e83-b57f-965df4bd043d","Type":"ContainerDied","Data":"7d1a97274b38bf99a1ed86190d4f1d99c2588ba1ecd252177f8d8bb041cc8621"} Feb 27 00:22:15 crc kubenswrapper[4781]: I0227 00:22:15.117149 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k995x" podStartSLOduration=1.83005375 podStartE2EDuration="6.117122111s" podCreationTimestamp="2026-02-27 00:22:09 +0000 UTC" firstStartedPulling="2026-02-27 00:22:10.047005615 +0000 UTC m=+999.304545169" lastFinishedPulling="2026-02-27 00:22:14.334073976 +0000 UTC m=+1003.591613530" observedRunningTime="2026-02-27 00:22:15.111012131 +0000 UTC m=+1004.368551725" watchObservedRunningTime="2026-02-27 
00:22:15.117122111 +0000 UTC m=+1004.374661695" Feb 27 00:22:15 crc kubenswrapper[4781]: I0227 00:22:15.131316 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cqkgx" podStartSLOduration=2.809616932 podStartE2EDuration="12.131296612s" podCreationTimestamp="2026-02-27 00:22:03 +0000 UTC" firstStartedPulling="2026-02-27 00:22:04.762024593 +0000 UTC m=+994.019564147" lastFinishedPulling="2026-02-27 00:22:14.083704273 +0000 UTC m=+1003.341243827" observedRunningTime="2026-02-27 00:22:15.127115432 +0000 UTC m=+1004.384654996" watchObservedRunningTime="2026-02-27 00:22:15.131296612 +0000 UTC m=+1004.388836186" Feb 27 00:22:16 crc kubenswrapper[4781]: I0227 00:22:16.107763 4781 generic.go:334] "Generic (PLEG): container finished" podID="43006307-3a88-4e83-b57f-965df4bd043d" containerID="6b7b7dab1857c3218b173c7757535f1d9dc87f018d0d0499b560386781f0d9cd" exitCode=0 Feb 27 00:22:16 crc kubenswrapper[4781]: I0227 00:22:16.107821 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j2n85" event={"ID":"43006307-3a88-4e83-b57f-965df4bd043d","Type":"ContainerDied","Data":"6b7b7dab1857c3218b173c7757535f1d9dc87f018d0d0499b560386781f0d9cd"} Feb 27 00:22:16 crc kubenswrapper[4781]: I0227 00:22:16.286204 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tcfvp"] Feb 27 00:22:16 crc kubenswrapper[4781]: I0227 00:22:16.286447 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tcfvp" podUID="d292517f-9d33-4590-beae-e0810b1395fa" containerName="registry-server" containerID="cri-o://d2674464663697b7cd845ec2becb3f87c0311300123fbaaf0825ff61fcdf9b0a" gracePeriod=2 Feb 27 00:22:17 crc kubenswrapper[4781]: I0227 00:22:17.121766 4781 generic.go:334] "Generic (PLEG): container finished" podID="d292517f-9d33-4590-beae-e0810b1395fa" 
containerID="d2674464663697b7cd845ec2becb3f87c0311300123fbaaf0825ff61fcdf9b0a" exitCode=0 Feb 27 00:22:17 crc kubenswrapper[4781]: I0227 00:22:17.121824 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tcfvp" event={"ID":"d292517f-9d33-4590-beae-e0810b1395fa","Type":"ContainerDied","Data":"d2674464663697b7cd845ec2becb3f87c0311300123fbaaf0825ff61fcdf9b0a"} Feb 27 00:22:17 crc kubenswrapper[4781]: I0227 00:22:17.125231 4781 generic.go:334] "Generic (PLEG): container finished" podID="43006307-3a88-4e83-b57f-965df4bd043d" containerID="49c494c22b05665d2a24ac9ebba393c2d522d32048b7482eb856415ae96d68de" exitCode=0 Feb 27 00:22:17 crc kubenswrapper[4781]: I0227 00:22:17.125274 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j2n85" event={"ID":"43006307-3a88-4e83-b57f-965df4bd043d","Type":"ContainerDied","Data":"49c494c22b05665d2a24ac9ebba393c2d522d32048b7482eb856415ae96d68de"} Feb 27 00:22:17 crc kubenswrapper[4781]: I0227 00:22:17.232803 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tcfvp" Feb 27 00:22:17 crc kubenswrapper[4781]: I0227 00:22:17.388572 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvznh\" (UniqueName: \"kubernetes.io/projected/d292517f-9d33-4590-beae-e0810b1395fa-kube-api-access-vvznh\") pod \"d292517f-9d33-4590-beae-e0810b1395fa\" (UID: \"d292517f-9d33-4590-beae-e0810b1395fa\") " Feb 27 00:22:17 crc kubenswrapper[4781]: I0227 00:22:17.388707 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d292517f-9d33-4590-beae-e0810b1395fa-utilities\") pod \"d292517f-9d33-4590-beae-e0810b1395fa\" (UID: \"d292517f-9d33-4590-beae-e0810b1395fa\") " Feb 27 00:22:17 crc kubenswrapper[4781]: I0227 00:22:17.388781 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d292517f-9d33-4590-beae-e0810b1395fa-catalog-content\") pod \"d292517f-9d33-4590-beae-e0810b1395fa\" (UID: \"d292517f-9d33-4590-beae-e0810b1395fa\") " Feb 27 00:22:17 crc kubenswrapper[4781]: I0227 00:22:17.389648 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d292517f-9d33-4590-beae-e0810b1395fa-utilities" (OuterVolumeSpecName: "utilities") pod "d292517f-9d33-4590-beae-e0810b1395fa" (UID: "d292517f-9d33-4590-beae-e0810b1395fa"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:22:17 crc kubenswrapper[4781]: I0227 00:22:17.390866 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d292517f-9d33-4590-beae-e0810b1395fa-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:22:17 crc kubenswrapper[4781]: I0227 00:22:17.393551 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d292517f-9d33-4590-beae-e0810b1395fa-kube-api-access-vvznh" (OuterVolumeSpecName: "kube-api-access-vvznh") pod "d292517f-9d33-4590-beae-e0810b1395fa" (UID: "d292517f-9d33-4590-beae-e0810b1395fa"). InnerVolumeSpecName "kube-api-access-vvznh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:22:17 crc kubenswrapper[4781]: I0227 00:22:17.444485 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d292517f-9d33-4590-beae-e0810b1395fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d292517f-9d33-4590-beae-e0810b1395fa" (UID: "d292517f-9d33-4590-beae-e0810b1395fa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:22:17 crc kubenswrapper[4781]: I0227 00:22:17.492580 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d292517f-9d33-4590-beae-e0810b1395fa-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:22:17 crc kubenswrapper[4781]: I0227 00:22:17.492614 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvznh\" (UniqueName: \"kubernetes.io/projected/d292517f-9d33-4590-beae-e0810b1395fa-kube-api-access-vvznh\") on node \"crc\" DevicePath \"\"" Feb 27 00:22:18 crc kubenswrapper[4781]: I0227 00:22:18.139650 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tcfvp" event={"ID":"d292517f-9d33-4590-beae-e0810b1395fa","Type":"ContainerDied","Data":"dc30d05e6e59ba685bb93b7eed25462156f94fb18da11a6113fc7887bdfc1c16"} Feb 27 00:22:18 crc kubenswrapper[4781]: I0227 00:22:18.139903 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tcfvp" Feb 27 00:22:18 crc kubenswrapper[4781]: I0227 00:22:18.140041 4781 scope.go:117] "RemoveContainer" containerID="d2674464663697b7cd845ec2becb3f87c0311300123fbaaf0825ff61fcdf9b0a" Feb 27 00:22:18 crc kubenswrapper[4781]: I0227 00:22:18.145021 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j2n85" event={"ID":"43006307-3a88-4e83-b57f-965df4bd043d","Type":"ContainerStarted","Data":"1d11fbc8a61aed2b70be8a828981dba857b2c5fbe461635c23f7a2280364a7db"} Feb 27 00:22:18 crc kubenswrapper[4781]: I0227 00:22:18.145075 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j2n85" event={"ID":"43006307-3a88-4e83-b57f-965df4bd043d","Type":"ContainerStarted","Data":"7ade9677e2a5a2798e3aec6259047a7e1fbde86fb818094e3dd5ae03401a632e"} Feb 27 00:22:18 crc kubenswrapper[4781]: I0227 00:22:18.145093 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j2n85" event={"ID":"43006307-3a88-4e83-b57f-965df4bd043d","Type":"ContainerStarted","Data":"d1e73566326443b55e16e9d978011a7707f5d6e1564e3c3885cba67b8851e546"} Feb 27 00:22:18 crc kubenswrapper[4781]: I0227 00:22:18.145113 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j2n85" event={"ID":"43006307-3a88-4e83-b57f-965df4bd043d","Type":"ContainerStarted","Data":"f16864687f756288835edd62a7293c94dff4d226ec5d0cdeeb0921a0f32f390a"} Feb 27 00:22:18 crc kubenswrapper[4781]: I0227 00:22:18.145128 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j2n85" event={"ID":"43006307-3a88-4e83-b57f-965df4bd043d","Type":"ContainerStarted","Data":"72be0a56ed19ea154914cc7c6eaec94dceae7b58953d578ffd4f3168750b2fbc"} Feb 27 00:22:18 crc kubenswrapper[4781]: I0227 00:22:18.145143 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j2n85" 
event={"ID":"43006307-3a88-4e83-b57f-965df4bd043d","Type":"ContainerStarted","Data":"818f97d6403f9ac22e319204b84e8821cec8a69a3658ea16a5bd1db2eda8b677"} Feb 27 00:22:18 crc kubenswrapper[4781]: I0227 00:22:18.145273 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-j2n85" Feb 27 00:22:18 crc kubenswrapper[4781]: I0227 00:22:18.168245 4781 scope.go:117] "RemoveContainer" containerID="fbb42f4e19a7208fb759ce636689c17c2b7bc5e5f12afafb5c11f053e74607ac" Feb 27 00:22:18 crc kubenswrapper[4781]: I0227 00:22:18.200709 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-j2n85" podStartSLOduration=5.640312921 podStartE2EDuration="15.200688029s" podCreationTimestamp="2026-02-27 00:22:03 +0000 UTC" firstStartedPulling="2026-02-27 00:22:04.524180947 +0000 UTC m=+993.781720501" lastFinishedPulling="2026-02-27 00:22:14.084556065 +0000 UTC m=+1003.342095609" observedRunningTime="2026-02-27 00:22:18.181814836 +0000 UTC m=+1007.439354420" watchObservedRunningTime="2026-02-27 00:22:18.200688029 +0000 UTC m=+1007.458227593" Feb 27 00:22:18 crc kubenswrapper[4781]: I0227 00:22:18.203353 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tcfvp"] Feb 27 00:22:18 crc kubenswrapper[4781]: I0227 00:22:18.221602 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tcfvp"] Feb 27 00:22:18 crc kubenswrapper[4781]: I0227 00:22:18.227501 4781 scope.go:117] "RemoveContainer" containerID="3f48d77562b78b7c4eb9406b5bae70799989c1e6ce32c4c8dfce12eef51c7679" Feb 27 00:22:19 crc kubenswrapper[4781]: I0227 00:22:19.294760 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-j2n85" Feb 27 00:22:19 crc kubenswrapper[4781]: I0227 00:22:19.319434 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d292517f-9d33-4590-beae-e0810b1395fa" 
path="/var/lib/kubelet/pods/d292517f-9d33-4590-beae-e0810b1395fa/volumes" Feb 27 00:22:19 crc kubenswrapper[4781]: I0227 00:22:19.333684 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-j2n85" Feb 27 00:22:19 crc kubenswrapper[4781]: I0227 00:22:19.418871 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k995x" Feb 27 00:22:19 crc kubenswrapper[4781]: I0227 00:22:19.418925 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k995x" Feb 27 00:22:19 crc kubenswrapper[4781]: I0227 00:22:19.458845 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k995x" Feb 27 00:22:20 crc kubenswrapper[4781]: I0227 00:22:20.212263 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k995x" Feb 27 00:22:21 crc kubenswrapper[4781]: I0227 00:22:21.482281 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k995x"] Feb 27 00:22:22 crc kubenswrapper[4781]: I0227 00:22:22.182577 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-k995x" podUID="586212ca-1380-4fea-a2f1-105fc30f56e3" containerName="registry-server" containerID="cri-o://3ec6cb644e4e7f5e7912db39c5819e0906af9f525a6bdefb01ec3bcd9cc04b98" gracePeriod=2 Feb 27 00:22:22 crc kubenswrapper[4781]: I0227 00:22:22.611485 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k995x" Feb 27 00:22:22 crc kubenswrapper[4781]: I0227 00:22:22.764747 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/586212ca-1380-4fea-a2f1-105fc30f56e3-catalog-content\") pod \"586212ca-1380-4fea-a2f1-105fc30f56e3\" (UID: \"586212ca-1380-4fea-a2f1-105fc30f56e3\") " Feb 27 00:22:22 crc kubenswrapper[4781]: I0227 00:22:22.764828 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jvpz\" (UniqueName: \"kubernetes.io/projected/586212ca-1380-4fea-a2f1-105fc30f56e3-kube-api-access-4jvpz\") pod \"586212ca-1380-4fea-a2f1-105fc30f56e3\" (UID: \"586212ca-1380-4fea-a2f1-105fc30f56e3\") " Feb 27 00:22:22 crc kubenswrapper[4781]: I0227 00:22:22.764890 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/586212ca-1380-4fea-a2f1-105fc30f56e3-utilities\") pod \"586212ca-1380-4fea-a2f1-105fc30f56e3\" (UID: \"586212ca-1380-4fea-a2f1-105fc30f56e3\") " Feb 27 00:22:22 crc kubenswrapper[4781]: I0227 00:22:22.766266 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/586212ca-1380-4fea-a2f1-105fc30f56e3-utilities" (OuterVolumeSpecName: "utilities") pod "586212ca-1380-4fea-a2f1-105fc30f56e3" (UID: "586212ca-1380-4fea-a2f1-105fc30f56e3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:22:22 crc kubenswrapper[4781]: I0227 00:22:22.789862 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/586212ca-1380-4fea-a2f1-105fc30f56e3-kube-api-access-4jvpz" (OuterVolumeSpecName: "kube-api-access-4jvpz") pod "586212ca-1380-4fea-a2f1-105fc30f56e3" (UID: "586212ca-1380-4fea-a2f1-105fc30f56e3"). InnerVolumeSpecName "kube-api-access-4jvpz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:22:22 crc kubenswrapper[4781]: I0227 00:22:22.828615 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/586212ca-1380-4fea-a2f1-105fc30f56e3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "586212ca-1380-4fea-a2f1-105fc30f56e3" (UID: "586212ca-1380-4fea-a2f1-105fc30f56e3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:22:22 crc kubenswrapper[4781]: I0227 00:22:22.866654 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/586212ca-1380-4fea-a2f1-105fc30f56e3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:22:22 crc kubenswrapper[4781]: I0227 00:22:22.866691 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jvpz\" (UniqueName: \"kubernetes.io/projected/586212ca-1380-4fea-a2f1-105fc30f56e3-kube-api-access-4jvpz\") on node \"crc\" DevicePath \"\"" Feb 27 00:22:22 crc kubenswrapper[4781]: I0227 00:22:22.866701 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/586212ca-1380-4fea-a2f1-105fc30f56e3-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:22:23 crc kubenswrapper[4781]: I0227 00:22:23.193130 4781 generic.go:334] "Generic (PLEG): container finished" podID="586212ca-1380-4fea-a2f1-105fc30f56e3" containerID="3ec6cb644e4e7f5e7912db39c5819e0906af9f525a6bdefb01ec3bcd9cc04b98" exitCode=0 Feb 27 00:22:23 crc kubenswrapper[4781]: I0227 00:22:23.193178 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k995x" event={"ID":"586212ca-1380-4fea-a2f1-105fc30f56e3","Type":"ContainerDied","Data":"3ec6cb644e4e7f5e7912db39c5819e0906af9f525a6bdefb01ec3bcd9cc04b98"} Feb 27 00:22:23 crc kubenswrapper[4781]: I0227 00:22:23.193205 4781 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-k995x" event={"ID":"586212ca-1380-4fea-a2f1-105fc30f56e3","Type":"ContainerDied","Data":"08252891894ad181fdad10de247b6b64df19ad9963bf55e86249173eed72e1a9"} Feb 27 00:22:23 crc kubenswrapper[4781]: I0227 00:22:23.193221 4781 scope.go:117] "RemoveContainer" containerID="3ec6cb644e4e7f5e7912db39c5819e0906af9f525a6bdefb01ec3bcd9cc04b98" Feb 27 00:22:23 crc kubenswrapper[4781]: I0227 00:22:23.193402 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k995x" Feb 27 00:22:23 crc kubenswrapper[4781]: I0227 00:22:23.229169 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k995x"] Feb 27 00:22:23 crc kubenswrapper[4781]: I0227 00:22:23.229243 4781 scope.go:117] "RemoveContainer" containerID="bc8dcc00708b2955324ae604b7e2f739c3a87b6da068a7363051cd49cea954b6" Feb 27 00:22:23 crc kubenswrapper[4781]: I0227 00:22:23.234545 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k995x"] Feb 27 00:22:23 crc kubenswrapper[4781]: I0227 00:22:23.247965 4781 scope.go:117] "RemoveContainer" containerID="54c0d7546be3f43a1d764e4f77f6f61486b1feb8e9e99b36db6fda04741816ed" Feb 27 00:22:23 crc kubenswrapper[4781]: I0227 00:22:23.277350 4781 scope.go:117] "RemoveContainer" containerID="3ec6cb644e4e7f5e7912db39c5819e0906af9f525a6bdefb01ec3bcd9cc04b98" Feb 27 00:22:23 crc kubenswrapper[4781]: E0227 00:22:23.277756 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ec6cb644e4e7f5e7912db39c5819e0906af9f525a6bdefb01ec3bcd9cc04b98\": container with ID starting with 3ec6cb644e4e7f5e7912db39c5819e0906af9f525a6bdefb01ec3bcd9cc04b98 not found: ID does not exist" containerID="3ec6cb644e4e7f5e7912db39c5819e0906af9f525a6bdefb01ec3bcd9cc04b98" Feb 27 00:22:23 crc kubenswrapper[4781]: I0227 
00:22:23.277808 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ec6cb644e4e7f5e7912db39c5819e0906af9f525a6bdefb01ec3bcd9cc04b98"} err="failed to get container status \"3ec6cb644e4e7f5e7912db39c5819e0906af9f525a6bdefb01ec3bcd9cc04b98\": rpc error: code = NotFound desc = could not find container \"3ec6cb644e4e7f5e7912db39c5819e0906af9f525a6bdefb01ec3bcd9cc04b98\": container with ID starting with 3ec6cb644e4e7f5e7912db39c5819e0906af9f525a6bdefb01ec3bcd9cc04b98 not found: ID does not exist" Feb 27 00:22:23 crc kubenswrapper[4781]: I0227 00:22:23.277830 4781 scope.go:117] "RemoveContainer" containerID="bc8dcc00708b2955324ae604b7e2f739c3a87b6da068a7363051cd49cea954b6" Feb 27 00:22:23 crc kubenswrapper[4781]: E0227 00:22:23.278084 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc8dcc00708b2955324ae604b7e2f739c3a87b6da068a7363051cd49cea954b6\": container with ID starting with bc8dcc00708b2955324ae604b7e2f739c3a87b6da068a7363051cd49cea954b6 not found: ID does not exist" containerID="bc8dcc00708b2955324ae604b7e2f739c3a87b6da068a7363051cd49cea954b6" Feb 27 00:22:23 crc kubenswrapper[4781]: I0227 00:22:23.278104 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc8dcc00708b2955324ae604b7e2f739c3a87b6da068a7363051cd49cea954b6"} err="failed to get container status \"bc8dcc00708b2955324ae604b7e2f739c3a87b6da068a7363051cd49cea954b6\": rpc error: code = NotFound desc = could not find container \"bc8dcc00708b2955324ae604b7e2f739c3a87b6da068a7363051cd49cea954b6\": container with ID starting with bc8dcc00708b2955324ae604b7e2f739c3a87b6da068a7363051cd49cea954b6 not found: ID does not exist" Feb 27 00:22:23 crc kubenswrapper[4781]: I0227 00:22:23.278137 4781 scope.go:117] "RemoveContainer" containerID="54c0d7546be3f43a1d764e4f77f6f61486b1feb8e9e99b36db6fda04741816ed" Feb 27 00:22:23 crc 
kubenswrapper[4781]: E0227 00:22:23.278499 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54c0d7546be3f43a1d764e4f77f6f61486b1feb8e9e99b36db6fda04741816ed\": container with ID starting with 54c0d7546be3f43a1d764e4f77f6f61486b1feb8e9e99b36db6fda04741816ed not found: ID does not exist" containerID="54c0d7546be3f43a1d764e4f77f6f61486b1feb8e9e99b36db6fda04741816ed" Feb 27 00:22:23 crc kubenswrapper[4781]: I0227 00:22:23.278539 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54c0d7546be3f43a1d764e4f77f6f61486b1feb8e9e99b36db6fda04741816ed"} err="failed to get container status \"54c0d7546be3f43a1d764e4f77f6f61486b1feb8e9e99b36db6fda04741816ed\": rpc error: code = NotFound desc = could not find container \"54c0d7546be3f43a1d764e4f77f6f61486b1feb8e9e99b36db6fda04741816ed\": container with ID starting with 54c0d7546be3f43a1d764e4f77f6f61486b1feb8e9e99b36db6fda04741816ed not found: ID does not exist" Feb 27 00:22:23 crc kubenswrapper[4781]: I0227 00:22:23.317377 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="586212ca-1380-4fea-a2f1-105fc30f56e3" path="/var/lib/kubelet/pods/586212ca-1380-4fea-a2f1-105fc30f56e3/volumes" Feb 27 00:22:24 crc kubenswrapper[4781]: I0227 00:22:24.317957 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cqkgx" Feb 27 00:22:25 crc kubenswrapper[4781]: I0227 00:22:25.903552 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-tljmv" Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.091690 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-rrx6z"] Feb 27 00:22:32 crc kubenswrapper[4781]: E0227 00:22:32.092187 4781 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="586212ca-1380-4fea-a2f1-105fc30f56e3" containerName="registry-server" Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.092198 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="586212ca-1380-4fea-a2f1-105fc30f56e3" containerName="registry-server" Feb 27 00:22:32 crc kubenswrapper[4781]: E0227 00:22:32.092218 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d292517f-9d33-4590-beae-e0810b1395fa" containerName="extract-content" Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.092224 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d292517f-9d33-4590-beae-e0810b1395fa" containerName="extract-content" Feb 27 00:22:32 crc kubenswrapper[4781]: E0227 00:22:32.092236 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="586212ca-1380-4fea-a2f1-105fc30f56e3" containerName="extract-utilities" Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.092242 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="586212ca-1380-4fea-a2f1-105fc30f56e3" containerName="extract-utilities" Feb 27 00:22:32 crc kubenswrapper[4781]: E0227 00:22:32.092252 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="586212ca-1380-4fea-a2f1-105fc30f56e3" containerName="extract-content" Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.092257 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="586212ca-1380-4fea-a2f1-105fc30f56e3" containerName="extract-content" Feb 27 00:22:32 crc kubenswrapper[4781]: E0227 00:22:32.092268 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d292517f-9d33-4590-beae-e0810b1395fa" containerName="registry-server" Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.092273 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d292517f-9d33-4590-beae-e0810b1395fa" containerName="registry-server" Feb 27 00:22:32 crc kubenswrapper[4781]: E0227 00:22:32.092283 4781 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d292517f-9d33-4590-beae-e0810b1395fa" containerName="extract-utilities" Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.092290 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d292517f-9d33-4590-beae-e0810b1395fa" containerName="extract-utilities" Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.092396 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="d292517f-9d33-4590-beae-e0810b1395fa" containerName="registry-server" Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.092404 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="586212ca-1380-4fea-a2f1-105fc30f56e3" containerName="registry-server" Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.093414 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rrx6z" Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.095838 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-s7w9m" Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.095981 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.096264 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.109558 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rrx6z"] Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.193275 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6hkr\" (UniqueName: \"kubernetes.io/projected/f66c974d-5687-42bd-9742-469922240fd5-kube-api-access-q6hkr\") pod \"openstack-operator-index-rrx6z\" (UID: \"f66c974d-5687-42bd-9742-469922240fd5\") " 
pod="openstack-operators/openstack-operator-index-rrx6z" Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.295061 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6hkr\" (UniqueName: \"kubernetes.io/projected/f66c974d-5687-42bd-9742-469922240fd5-kube-api-access-q6hkr\") pod \"openstack-operator-index-rrx6z\" (UID: \"f66c974d-5687-42bd-9742-469922240fd5\") " pod="openstack-operators/openstack-operator-index-rrx6z" Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.313784 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6hkr\" (UniqueName: \"kubernetes.io/projected/f66c974d-5687-42bd-9742-469922240fd5-kube-api-access-q6hkr\") pod \"openstack-operator-index-rrx6z\" (UID: \"f66c974d-5687-42bd-9742-469922240fd5\") " pod="openstack-operators/openstack-operator-index-rrx6z" Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.424338 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rrx6z" Feb 27 00:22:32 crc kubenswrapper[4781]: I0227 00:22:32.817377 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rrx6z"] Feb 27 00:22:33 crc kubenswrapper[4781]: I0227 00:22:33.268383 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rrx6z" event={"ID":"f66c974d-5687-42bd-9742-469922240fd5","Type":"ContainerStarted","Data":"233e724f10dca6bc8119540049620e5e753edba61bd421691e91b4d2fb68526b"} Feb 27 00:22:34 crc kubenswrapper[4781]: I0227 00:22:34.303599 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-j2n85" Feb 27 00:22:35 crc kubenswrapper[4781]: I0227 00:22:35.285253 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rrx6z" 
event={"ID":"f66c974d-5687-42bd-9742-469922240fd5","Type":"ContainerStarted","Data":"6d247caa220fa7adc12fb8d3b113153200f3ad9a6c3899aa94ab78b37af649ff"} Feb 27 00:22:35 crc kubenswrapper[4781]: I0227 00:22:35.311002 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-rrx6z" podStartSLOduration=1.082876801 podStartE2EDuration="3.310975763s" podCreationTimestamp="2026-02-27 00:22:32 +0000 UTC" firstStartedPulling="2026-02-27 00:22:32.829242593 +0000 UTC m=+1022.086782147" lastFinishedPulling="2026-02-27 00:22:35.057341555 +0000 UTC m=+1024.314881109" observedRunningTime="2026-02-27 00:22:35.302604834 +0000 UTC m=+1024.560144418" watchObservedRunningTime="2026-02-27 00:22:35.310975763 +0000 UTC m=+1024.568515347" Feb 27 00:22:37 crc kubenswrapper[4781]: I0227 00:22:37.695067 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xhpbl"] Feb 27 00:22:37 crc kubenswrapper[4781]: I0227 00:22:37.702370 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xhpbl" Feb 27 00:22:37 crc kubenswrapper[4781]: I0227 00:22:37.709048 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xhpbl"] Feb 27 00:22:37 crc kubenswrapper[4781]: I0227 00:22:37.794504 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9p6p\" (UniqueName: \"kubernetes.io/projected/2595ff69-b633-443a-81ca-238982513cf4-kube-api-access-d9p6p\") pod \"redhat-marketplace-xhpbl\" (UID: \"2595ff69-b633-443a-81ca-238982513cf4\") " pod="openshift-marketplace/redhat-marketplace-xhpbl" Feb 27 00:22:37 crc kubenswrapper[4781]: I0227 00:22:37.794566 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2595ff69-b633-443a-81ca-238982513cf4-catalog-content\") pod \"redhat-marketplace-xhpbl\" (UID: \"2595ff69-b633-443a-81ca-238982513cf4\") " pod="openshift-marketplace/redhat-marketplace-xhpbl" Feb 27 00:22:37 crc kubenswrapper[4781]: I0227 00:22:37.794675 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2595ff69-b633-443a-81ca-238982513cf4-utilities\") pod \"redhat-marketplace-xhpbl\" (UID: \"2595ff69-b633-443a-81ca-238982513cf4\") " pod="openshift-marketplace/redhat-marketplace-xhpbl" Feb 27 00:22:37 crc kubenswrapper[4781]: I0227 00:22:37.895939 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9p6p\" (UniqueName: \"kubernetes.io/projected/2595ff69-b633-443a-81ca-238982513cf4-kube-api-access-d9p6p\") pod \"redhat-marketplace-xhpbl\" (UID: \"2595ff69-b633-443a-81ca-238982513cf4\") " pod="openshift-marketplace/redhat-marketplace-xhpbl" Feb 27 00:22:37 crc kubenswrapper[4781]: I0227 00:22:37.895995 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2595ff69-b633-443a-81ca-238982513cf4-catalog-content\") pod \"redhat-marketplace-xhpbl\" (UID: \"2595ff69-b633-443a-81ca-238982513cf4\") " pod="openshift-marketplace/redhat-marketplace-xhpbl" Feb 27 00:22:37 crc kubenswrapper[4781]: I0227 00:22:37.896054 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2595ff69-b633-443a-81ca-238982513cf4-utilities\") pod \"redhat-marketplace-xhpbl\" (UID: \"2595ff69-b633-443a-81ca-238982513cf4\") " pod="openshift-marketplace/redhat-marketplace-xhpbl" Feb 27 00:22:37 crc kubenswrapper[4781]: I0227 00:22:37.896532 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2595ff69-b633-443a-81ca-238982513cf4-utilities\") pod \"redhat-marketplace-xhpbl\" (UID: \"2595ff69-b633-443a-81ca-238982513cf4\") " pod="openshift-marketplace/redhat-marketplace-xhpbl" Feb 27 00:22:37 crc kubenswrapper[4781]: I0227 00:22:37.896553 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2595ff69-b633-443a-81ca-238982513cf4-catalog-content\") pod \"redhat-marketplace-xhpbl\" (UID: \"2595ff69-b633-443a-81ca-238982513cf4\") " pod="openshift-marketplace/redhat-marketplace-xhpbl" Feb 27 00:22:37 crc kubenswrapper[4781]: I0227 00:22:37.917684 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9p6p\" (UniqueName: \"kubernetes.io/projected/2595ff69-b633-443a-81ca-238982513cf4-kube-api-access-d9p6p\") pod \"redhat-marketplace-xhpbl\" (UID: \"2595ff69-b633-443a-81ca-238982513cf4\") " pod="openshift-marketplace/redhat-marketplace-xhpbl" Feb 27 00:22:38 crc kubenswrapper[4781]: I0227 00:22:38.020427 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xhpbl" Feb 27 00:22:38 crc kubenswrapper[4781]: I0227 00:22:38.531199 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xhpbl"] Feb 27 00:22:38 crc kubenswrapper[4781]: W0227 00:22:38.542606 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2595ff69_b633_443a_81ca_238982513cf4.slice/crio-d510acce79081bd256d287a8b249b7c90204000cf94cd34fe8e254caa90ba14b WatchSource:0}: Error finding container d510acce79081bd256d287a8b249b7c90204000cf94cd34fe8e254caa90ba14b: Status 404 returned error can't find the container with id d510acce79081bd256d287a8b249b7c90204000cf94cd34fe8e254caa90ba14b Feb 27 00:22:39 crc kubenswrapper[4781]: I0227 00:22:39.312052 4781 generic.go:334] "Generic (PLEG): container finished" podID="2595ff69-b633-443a-81ca-238982513cf4" containerID="72146b21e6a598f0d19679595e182fcde5a67855f58589e196415993f2a3b7f4" exitCode=0 Feb 27 00:22:39 crc kubenswrapper[4781]: I0227 00:22:39.316323 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xhpbl" event={"ID":"2595ff69-b633-443a-81ca-238982513cf4","Type":"ContainerDied","Data":"72146b21e6a598f0d19679595e182fcde5a67855f58589e196415993f2a3b7f4"} Feb 27 00:22:39 crc kubenswrapper[4781]: I0227 00:22:39.316353 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xhpbl" event={"ID":"2595ff69-b633-443a-81ca-238982513cf4","Type":"ContainerStarted","Data":"d510acce79081bd256d287a8b249b7c90204000cf94cd34fe8e254caa90ba14b"} Feb 27 00:22:40 crc kubenswrapper[4781]: I0227 00:22:40.320559 4781 generic.go:334] "Generic (PLEG): container finished" podID="2595ff69-b633-443a-81ca-238982513cf4" containerID="3c9631be4e58fde1db46b99f19a1e6807b18aacae49000c04cc5603d62ba18fa" exitCode=0 Feb 27 00:22:40 crc kubenswrapper[4781]: I0227 
00:22:40.320602 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xhpbl" event={"ID":"2595ff69-b633-443a-81ca-238982513cf4","Type":"ContainerDied","Data":"3c9631be4e58fde1db46b99f19a1e6807b18aacae49000c04cc5603d62ba18fa"} Feb 27 00:22:41 crc kubenswrapper[4781]: I0227 00:22:41.328702 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xhpbl" event={"ID":"2595ff69-b633-443a-81ca-238982513cf4","Type":"ContainerStarted","Data":"4e7d81148b9a631f02e9b59a7880d37ff51e7b7fc6595218d4abbd94b9c5c429"} Feb 27 00:22:41 crc kubenswrapper[4781]: I0227 00:22:41.342097 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xhpbl" podStartSLOduration=2.8702360000000002 podStartE2EDuration="4.342079107s" podCreationTimestamp="2026-02-27 00:22:37 +0000 UTC" firstStartedPulling="2026-02-27 00:22:39.314166327 +0000 UTC m=+1028.571705891" lastFinishedPulling="2026-02-27 00:22:40.786009444 +0000 UTC m=+1030.043548998" observedRunningTime="2026-02-27 00:22:41.342053436 +0000 UTC m=+1030.599592990" watchObservedRunningTime="2026-02-27 00:22:41.342079107 +0000 UTC m=+1030.599618651" Feb 27 00:22:42 crc kubenswrapper[4781]: I0227 00:22:42.425181 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-rrx6z" Feb 27 00:22:42 crc kubenswrapper[4781]: I0227 00:22:42.425233 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-rrx6z" Feb 27 00:22:42 crc kubenswrapper[4781]: I0227 00:22:42.451490 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-rrx6z" Feb 27 00:22:43 crc kubenswrapper[4781]: I0227 00:22:43.369349 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-operator-index-rrx6z" Feb 27 00:22:45 crc kubenswrapper[4781]: I0227 00:22:45.545386 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b"] Feb 27 00:22:45 crc kubenswrapper[4781]: I0227 00:22:45.547951 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b" Feb 27 00:22:45 crc kubenswrapper[4781]: I0227 00:22:45.557612 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-xcdg7" Feb 27 00:22:45 crc kubenswrapper[4781]: I0227 00:22:45.557714 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b"] Feb 27 00:22:45 crc kubenswrapper[4781]: I0227 00:22:45.696817 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sswq6\" (UniqueName: \"kubernetes.io/projected/343b5811-baf3-443e-a8fe-074f7b844d14-kube-api-access-sswq6\") pod \"1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b\" (UID: \"343b5811-baf3-443e-a8fe-074f7b844d14\") " pod="openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b" Feb 27 00:22:45 crc kubenswrapper[4781]: I0227 00:22:45.697043 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/343b5811-baf3-443e-a8fe-074f7b844d14-util\") pod \"1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b\" (UID: \"343b5811-baf3-443e-a8fe-074f7b844d14\") " pod="openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b" Feb 27 00:22:45 crc kubenswrapper[4781]: I0227 00:22:45.697329 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/343b5811-baf3-443e-a8fe-074f7b844d14-bundle\") pod \"1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b\" (UID: \"343b5811-baf3-443e-a8fe-074f7b844d14\") " pod="openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b" Feb 27 00:22:45 crc kubenswrapper[4781]: I0227 00:22:45.799357 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/343b5811-baf3-443e-a8fe-074f7b844d14-bundle\") pod \"1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b\" (UID: \"343b5811-baf3-443e-a8fe-074f7b844d14\") " pod="openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b" Feb 27 00:22:45 crc kubenswrapper[4781]: I0227 00:22:45.799463 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sswq6\" (UniqueName: \"kubernetes.io/projected/343b5811-baf3-443e-a8fe-074f7b844d14-kube-api-access-sswq6\") pod \"1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b\" (UID: \"343b5811-baf3-443e-a8fe-074f7b844d14\") " pod="openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b" Feb 27 00:22:45 crc kubenswrapper[4781]: I0227 00:22:45.799526 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/343b5811-baf3-443e-a8fe-074f7b844d14-util\") pod \"1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b\" (UID: \"343b5811-baf3-443e-a8fe-074f7b844d14\") " pod="openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b" Feb 27 00:22:45 crc kubenswrapper[4781]: I0227 00:22:45.799897 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/343b5811-baf3-443e-a8fe-074f7b844d14-bundle\") pod 
\"1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b\" (UID: \"343b5811-baf3-443e-a8fe-074f7b844d14\") " pod="openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b" Feb 27 00:22:45 crc kubenswrapper[4781]: I0227 00:22:45.799989 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/343b5811-baf3-443e-a8fe-074f7b844d14-util\") pod \"1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b\" (UID: \"343b5811-baf3-443e-a8fe-074f7b844d14\") " pod="openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b" Feb 27 00:22:45 crc kubenswrapper[4781]: I0227 00:22:45.822891 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sswq6\" (UniqueName: \"kubernetes.io/projected/343b5811-baf3-443e-a8fe-074f7b844d14-kube-api-access-sswq6\") pod \"1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b\" (UID: \"343b5811-baf3-443e-a8fe-074f7b844d14\") " pod="openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b" Feb 27 00:22:45 crc kubenswrapper[4781]: I0227 00:22:45.870298 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b" Feb 27 00:22:46 crc kubenswrapper[4781]: I0227 00:22:46.342675 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b"] Feb 27 00:22:46 crc kubenswrapper[4781]: I0227 00:22:46.365348 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b" event={"ID":"343b5811-baf3-443e-a8fe-074f7b844d14","Type":"ContainerStarted","Data":"b5cc513bb3039586b4f615fb8a649c5385cbc55e7cadc346698a4db8fb0b0af7"} Feb 27 00:22:47 crc kubenswrapper[4781]: I0227 00:22:47.372115 4781 generic.go:334] "Generic (PLEG): container finished" podID="343b5811-baf3-443e-a8fe-074f7b844d14" containerID="b25efba51952e564d3e981d032e4b63e9e54af39849e65b027cfac79467364c7" exitCode=0 Feb 27 00:22:47 crc kubenswrapper[4781]: I0227 00:22:47.372370 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b" event={"ID":"343b5811-baf3-443e-a8fe-074f7b844d14","Type":"ContainerDied","Data":"b25efba51952e564d3e981d032e4b63e9e54af39849e65b027cfac79467364c7"} Feb 27 00:22:48 crc kubenswrapper[4781]: I0227 00:22:48.020566 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xhpbl" Feb 27 00:22:48 crc kubenswrapper[4781]: I0227 00:22:48.020676 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xhpbl" Feb 27 00:22:48 crc kubenswrapper[4781]: I0227 00:22:48.058080 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xhpbl" Feb 27 00:22:48 crc kubenswrapper[4781]: I0227 00:22:48.382860 4781 generic.go:334] "Generic (PLEG): container finished" 
podID="343b5811-baf3-443e-a8fe-074f7b844d14" containerID="20931fb0e24304c280dec3a74e5cb7c1a581d67c8b4c7fcf99bc0128ac8d0526" exitCode=0 Feb 27 00:22:48 crc kubenswrapper[4781]: I0227 00:22:48.384834 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b" event={"ID":"343b5811-baf3-443e-a8fe-074f7b844d14","Type":"ContainerDied","Data":"20931fb0e24304c280dec3a74e5cb7c1a581d67c8b4c7fcf99bc0128ac8d0526"} Feb 27 00:22:48 crc kubenswrapper[4781]: I0227 00:22:48.432740 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xhpbl" Feb 27 00:22:49 crc kubenswrapper[4781]: I0227 00:22:49.067112 4781 scope.go:117] "RemoveContainer" containerID="96bd641ff5c28b0d487d9f55a81f55a83bc758e496b0e0a0d2639cc8d0b260d5" Feb 27 00:22:49 crc kubenswrapper[4781]: I0227 00:22:49.391003 4781 generic.go:334] "Generic (PLEG): container finished" podID="343b5811-baf3-443e-a8fe-074f7b844d14" containerID="59a035570e9bebe7c38eff7b66207812fc41e675067fc11daaac1296a49c6bb2" exitCode=0 Feb 27 00:22:49 crc kubenswrapper[4781]: I0227 00:22:49.391130 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b" event={"ID":"343b5811-baf3-443e-a8fe-074f7b844d14","Type":"ContainerDied","Data":"59a035570e9bebe7c38eff7b66207812fc41e675067fc11daaac1296a49c6bb2"} Feb 27 00:22:50 crc kubenswrapper[4781]: I0227 00:22:50.711455 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b" Feb 27 00:22:50 crc kubenswrapper[4781]: I0227 00:22:50.867799 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sswq6\" (UniqueName: \"kubernetes.io/projected/343b5811-baf3-443e-a8fe-074f7b844d14-kube-api-access-sswq6\") pod \"343b5811-baf3-443e-a8fe-074f7b844d14\" (UID: \"343b5811-baf3-443e-a8fe-074f7b844d14\") " Feb 27 00:22:50 crc kubenswrapper[4781]: I0227 00:22:50.867882 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/343b5811-baf3-443e-a8fe-074f7b844d14-util\") pod \"343b5811-baf3-443e-a8fe-074f7b844d14\" (UID: \"343b5811-baf3-443e-a8fe-074f7b844d14\") " Feb 27 00:22:50 crc kubenswrapper[4781]: I0227 00:22:50.867933 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/343b5811-baf3-443e-a8fe-074f7b844d14-bundle\") pod \"343b5811-baf3-443e-a8fe-074f7b844d14\" (UID: \"343b5811-baf3-443e-a8fe-074f7b844d14\") " Feb 27 00:22:50 crc kubenswrapper[4781]: I0227 00:22:50.869037 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/343b5811-baf3-443e-a8fe-074f7b844d14-bundle" (OuterVolumeSpecName: "bundle") pod "343b5811-baf3-443e-a8fe-074f7b844d14" (UID: "343b5811-baf3-443e-a8fe-074f7b844d14"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:22:50 crc kubenswrapper[4781]: I0227 00:22:50.875247 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/343b5811-baf3-443e-a8fe-074f7b844d14-kube-api-access-sswq6" (OuterVolumeSpecName: "kube-api-access-sswq6") pod "343b5811-baf3-443e-a8fe-074f7b844d14" (UID: "343b5811-baf3-443e-a8fe-074f7b844d14"). InnerVolumeSpecName "kube-api-access-sswq6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:22:50 crc kubenswrapper[4781]: I0227 00:22:50.881096 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/343b5811-baf3-443e-a8fe-074f7b844d14-util" (OuterVolumeSpecName: "util") pod "343b5811-baf3-443e-a8fe-074f7b844d14" (UID: "343b5811-baf3-443e-a8fe-074f7b844d14"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:22:50 crc kubenswrapper[4781]: I0227 00:22:50.969877 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sswq6\" (UniqueName: \"kubernetes.io/projected/343b5811-baf3-443e-a8fe-074f7b844d14-kube-api-access-sswq6\") on node \"crc\" DevicePath \"\"" Feb 27 00:22:50 crc kubenswrapper[4781]: I0227 00:22:50.970385 4781 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/343b5811-baf3-443e-a8fe-074f7b844d14-util\") on node \"crc\" DevicePath \"\"" Feb 27 00:22:50 crc kubenswrapper[4781]: I0227 00:22:50.970582 4781 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/343b5811-baf3-443e-a8fe-074f7b844d14-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:22:51 crc kubenswrapper[4781]: I0227 00:22:51.408105 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b" event={"ID":"343b5811-baf3-443e-a8fe-074f7b844d14","Type":"ContainerDied","Data":"b5cc513bb3039586b4f615fb8a649c5385cbc55e7cadc346698a4db8fb0b0af7"} Feb 27 00:22:51 crc kubenswrapper[4781]: I0227 00:22:51.408136 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b" Feb 27 00:22:51 crc kubenswrapper[4781]: I0227 00:22:51.408151 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5cc513bb3039586b4f615fb8a649c5385cbc55e7cadc346698a4db8fb0b0af7" Feb 27 00:22:52 crc kubenswrapper[4781]: I0227 00:22:52.085292 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xhpbl"] Feb 27 00:22:52 crc kubenswrapper[4781]: I0227 00:22:52.085753 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xhpbl" podUID="2595ff69-b633-443a-81ca-238982513cf4" containerName="registry-server" containerID="cri-o://4e7d81148b9a631f02e9b59a7880d37ff51e7b7fc6595218d4abbd94b9c5c429" gracePeriod=2 Feb 27 00:22:52 crc kubenswrapper[4781]: I0227 00:22:52.420744 4781 generic.go:334] "Generic (PLEG): container finished" podID="2595ff69-b633-443a-81ca-238982513cf4" containerID="4e7d81148b9a631f02e9b59a7880d37ff51e7b7fc6595218d4abbd94b9c5c429" exitCode=0 Feb 27 00:22:52 crc kubenswrapper[4781]: I0227 00:22:52.421021 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xhpbl" event={"ID":"2595ff69-b633-443a-81ca-238982513cf4","Type":"ContainerDied","Data":"4e7d81148b9a631f02e9b59a7880d37ff51e7b7fc6595218d4abbd94b9c5c429"} Feb 27 00:22:52 crc kubenswrapper[4781]: I0227 00:22:52.526686 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xhpbl" Feb 27 00:22:52 crc kubenswrapper[4781]: I0227 00:22:52.623586 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9p6p\" (UniqueName: \"kubernetes.io/projected/2595ff69-b633-443a-81ca-238982513cf4-kube-api-access-d9p6p\") pod \"2595ff69-b633-443a-81ca-238982513cf4\" (UID: \"2595ff69-b633-443a-81ca-238982513cf4\") " Feb 27 00:22:52 crc kubenswrapper[4781]: I0227 00:22:52.623658 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2595ff69-b633-443a-81ca-238982513cf4-catalog-content\") pod \"2595ff69-b633-443a-81ca-238982513cf4\" (UID: \"2595ff69-b633-443a-81ca-238982513cf4\") " Feb 27 00:22:52 crc kubenswrapper[4781]: I0227 00:22:52.623705 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2595ff69-b633-443a-81ca-238982513cf4-utilities\") pod \"2595ff69-b633-443a-81ca-238982513cf4\" (UID: \"2595ff69-b633-443a-81ca-238982513cf4\") " Feb 27 00:22:52 crc kubenswrapper[4781]: I0227 00:22:52.624818 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2595ff69-b633-443a-81ca-238982513cf4-utilities" (OuterVolumeSpecName: "utilities") pod "2595ff69-b633-443a-81ca-238982513cf4" (UID: "2595ff69-b633-443a-81ca-238982513cf4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:22:52 crc kubenswrapper[4781]: I0227 00:22:52.632846 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2595ff69-b633-443a-81ca-238982513cf4-kube-api-access-d9p6p" (OuterVolumeSpecName: "kube-api-access-d9p6p") pod "2595ff69-b633-443a-81ca-238982513cf4" (UID: "2595ff69-b633-443a-81ca-238982513cf4"). InnerVolumeSpecName "kube-api-access-d9p6p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:22:52 crc kubenswrapper[4781]: I0227 00:22:52.659107 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2595ff69-b633-443a-81ca-238982513cf4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2595ff69-b633-443a-81ca-238982513cf4" (UID: "2595ff69-b633-443a-81ca-238982513cf4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:22:52 crc kubenswrapper[4781]: I0227 00:22:52.724702 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2595ff69-b633-443a-81ca-238982513cf4-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:22:52 crc kubenswrapper[4781]: I0227 00:22:52.724740 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9p6p\" (UniqueName: \"kubernetes.io/projected/2595ff69-b633-443a-81ca-238982513cf4-kube-api-access-d9p6p\") on node \"crc\" DevicePath \"\"" Feb 27 00:22:52 crc kubenswrapper[4781]: I0227 00:22:52.724749 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2595ff69-b633-443a-81ca-238982513cf4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:22:53 crc kubenswrapper[4781]: I0227 00:22:53.429703 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xhpbl" event={"ID":"2595ff69-b633-443a-81ca-238982513cf4","Type":"ContainerDied","Data":"d510acce79081bd256d287a8b249b7c90204000cf94cd34fe8e254caa90ba14b"} Feb 27 00:22:53 crc kubenswrapper[4781]: I0227 00:22:53.429744 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xhpbl" Feb 27 00:22:53 crc kubenswrapper[4781]: I0227 00:22:53.430078 4781 scope.go:117] "RemoveContainer" containerID="4e7d81148b9a631f02e9b59a7880d37ff51e7b7fc6595218d4abbd94b9c5c429" Feb 27 00:22:53 crc kubenswrapper[4781]: I0227 00:22:53.468233 4781 scope.go:117] "RemoveContainer" containerID="3c9631be4e58fde1db46b99f19a1e6807b18aacae49000c04cc5603d62ba18fa" Feb 27 00:22:53 crc kubenswrapper[4781]: I0227 00:22:53.479789 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xhpbl"] Feb 27 00:22:53 crc kubenswrapper[4781]: I0227 00:22:53.487863 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xhpbl"] Feb 27 00:22:53 crc kubenswrapper[4781]: I0227 00:22:53.497821 4781 scope.go:117] "RemoveContainer" containerID="72146b21e6a598f0d19679595e182fcde5a67855f58589e196415993f2a3b7f4" Feb 27 00:22:55 crc kubenswrapper[4781]: I0227 00:22:55.318108 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2595ff69-b633-443a-81ca-238982513cf4" path="/var/lib/kubelet/pods/2595ff69-b633-443a-81ca-238982513cf4/volumes" Feb 27 00:22:56 crc kubenswrapper[4781]: I0227 00:22:56.888827 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-85cf9d4d7d-cl7rb"] Feb 27 00:22:56 crc kubenswrapper[4781]: E0227 00:22:56.889561 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2595ff69-b633-443a-81ca-238982513cf4" containerName="registry-server" Feb 27 00:22:56 crc kubenswrapper[4781]: I0227 00:22:56.889589 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2595ff69-b633-443a-81ca-238982513cf4" containerName="registry-server" Feb 27 00:22:56 crc kubenswrapper[4781]: E0227 00:22:56.889615 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2595ff69-b633-443a-81ca-238982513cf4" 
containerName="extract-utilities" Feb 27 00:22:56 crc kubenswrapper[4781]: I0227 00:22:56.889658 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2595ff69-b633-443a-81ca-238982513cf4" containerName="extract-utilities" Feb 27 00:22:56 crc kubenswrapper[4781]: E0227 00:22:56.889692 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="343b5811-baf3-443e-a8fe-074f7b844d14" containerName="extract" Feb 27 00:22:56 crc kubenswrapper[4781]: I0227 00:22:56.889705 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="343b5811-baf3-443e-a8fe-074f7b844d14" containerName="extract" Feb 27 00:22:56 crc kubenswrapper[4781]: E0227 00:22:56.889729 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="343b5811-baf3-443e-a8fe-074f7b844d14" containerName="pull" Feb 27 00:22:56 crc kubenswrapper[4781]: I0227 00:22:56.889741 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="343b5811-baf3-443e-a8fe-074f7b844d14" containerName="pull" Feb 27 00:22:56 crc kubenswrapper[4781]: E0227 00:22:56.889756 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2595ff69-b633-443a-81ca-238982513cf4" containerName="extract-content" Feb 27 00:22:56 crc kubenswrapper[4781]: I0227 00:22:56.889770 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2595ff69-b633-443a-81ca-238982513cf4" containerName="extract-content" Feb 27 00:22:56 crc kubenswrapper[4781]: E0227 00:22:56.889790 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="343b5811-baf3-443e-a8fe-074f7b844d14" containerName="util" Feb 27 00:22:56 crc kubenswrapper[4781]: I0227 00:22:56.889802 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="343b5811-baf3-443e-a8fe-074f7b844d14" containerName="util" Feb 27 00:22:56 crc kubenswrapper[4781]: I0227 00:22:56.890011 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="343b5811-baf3-443e-a8fe-074f7b844d14" containerName="extract" Feb 27 00:22:56 crc kubenswrapper[4781]: I0227 
00:22:56.890035 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2595ff69-b633-443a-81ca-238982513cf4" containerName="registry-server" Feb 27 00:22:56 crc kubenswrapper[4781]: I0227 00:22:56.890791 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-85cf9d4d7d-cl7rb" Feb 27 00:22:56 crc kubenswrapper[4781]: I0227 00:22:56.893723 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-49nbd" Feb 27 00:22:56 crc kubenswrapper[4781]: I0227 00:22:56.958052 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-85cf9d4d7d-cl7rb"] Feb 27 00:22:56 crc kubenswrapper[4781]: I0227 00:22:56.982527 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvzs9\" (UniqueName: \"kubernetes.io/projected/837579c4-87be-4ce8-94ff-bf25307562db-kube-api-access-hvzs9\") pod \"openstack-operator-controller-init-85cf9d4d7d-cl7rb\" (UID: \"837579c4-87be-4ce8-94ff-bf25307562db\") " pod="openstack-operators/openstack-operator-controller-init-85cf9d4d7d-cl7rb" Feb 27 00:22:57 crc kubenswrapper[4781]: I0227 00:22:57.083787 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvzs9\" (UniqueName: \"kubernetes.io/projected/837579c4-87be-4ce8-94ff-bf25307562db-kube-api-access-hvzs9\") pod \"openstack-operator-controller-init-85cf9d4d7d-cl7rb\" (UID: \"837579c4-87be-4ce8-94ff-bf25307562db\") " pod="openstack-operators/openstack-operator-controller-init-85cf9d4d7d-cl7rb" Feb 27 00:22:57 crc kubenswrapper[4781]: I0227 00:22:57.101573 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvzs9\" (UniqueName: \"kubernetes.io/projected/837579c4-87be-4ce8-94ff-bf25307562db-kube-api-access-hvzs9\") pod 
\"openstack-operator-controller-init-85cf9d4d7d-cl7rb\" (UID: \"837579c4-87be-4ce8-94ff-bf25307562db\") " pod="openstack-operators/openstack-operator-controller-init-85cf9d4d7d-cl7rb" Feb 27 00:22:57 crc kubenswrapper[4781]: I0227 00:22:57.213910 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-85cf9d4d7d-cl7rb" Feb 27 00:22:57 crc kubenswrapper[4781]: I0227 00:22:57.772922 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-85cf9d4d7d-cl7rb"] Feb 27 00:22:58 crc kubenswrapper[4781]: I0227 00:22:58.461838 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-85cf9d4d7d-cl7rb" event={"ID":"837579c4-87be-4ce8-94ff-bf25307562db","Type":"ContainerStarted","Data":"cc33ba6650701b3cee9761acdb9a2ccc4cfb594b7a51d01c95ed5ecef8fd1322"} Feb 27 00:23:02 crc kubenswrapper[4781]: I0227 00:23:02.493956 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-85cf9d4d7d-cl7rb" event={"ID":"837579c4-87be-4ce8-94ff-bf25307562db","Type":"ContainerStarted","Data":"96a3c846f670e5eb6b260f2637021742a1d1d083d4c71b4991afdcf3a74c14ee"} Feb 27 00:23:02 crc kubenswrapper[4781]: I0227 00:23:02.494431 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-85cf9d4d7d-cl7rb" Feb 27 00:23:02 crc kubenswrapper[4781]: I0227 00:23:02.526708 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-85cf9d4d7d-cl7rb" podStartSLOduration=2.69838488 podStartE2EDuration="6.526688472s" podCreationTimestamp="2026-02-27 00:22:56 +0000 UTC" firstStartedPulling="2026-02-27 00:22:57.785139702 +0000 UTC m=+1047.042679256" lastFinishedPulling="2026-02-27 00:23:01.613443284 +0000 UTC m=+1050.870982848" 
observedRunningTime="2026-02-27 00:23:02.522281777 +0000 UTC m=+1051.779821331" watchObservedRunningTime="2026-02-27 00:23:02.526688472 +0000 UTC m=+1051.784228026" Feb 27 00:23:07 crc kubenswrapper[4781]: I0227 00:23:07.217567 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-85cf9d4d7d-cl7rb" Feb 27 00:23:12 crc kubenswrapper[4781]: I0227 00:23:12.895369 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:23:12 crc kubenswrapper[4781]: I0227 00:23:12.895952 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.721795 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-rfwpm"] Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.723104 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rfwpm" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.725478 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-dtmpw" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.729820 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-fb2wf"] Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.730649 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-fb2wf" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.738096 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-nqnlf" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.740404 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-rn44b"] Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.741257 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-rn44b" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.744400 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-p57jl" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.749608 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-fb2wf"] Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.753593 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-rfwpm"] Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.776580 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-rn44b"] Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.793605 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-4gl88"] Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.794683 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-4gl88" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.796156 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-2gkks" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.802141 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-nfzvw"] Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.802974 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-nfzvw" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.808754 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-4mv5w" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.821303 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fmbwz"] Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.822148 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fmbwz" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.822747 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlngn\" (UniqueName: \"kubernetes.io/projected/fe1f6a92-751f-417e-b2ff-694c10210db7-kube-api-access-jlngn\") pod \"barbican-operator-controller-manager-868647ff47-rfwpm\" (UID: \"fe1f6a92-751f-417e-b2ff-694c10210db7\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rfwpm" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.822787 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4jfx\" (UniqueName: \"kubernetes.io/projected/bd77d7fe-85fb-4b16-aa12-75359b52e139-kube-api-access-p4jfx\") pod \"designate-operator-controller-manager-6d8bf5c495-rn44b\" (UID: \"bd77d7fe-85fb-4b16-aa12-75359b52e139\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-rn44b" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.822817 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79tf8\" (UniqueName: \"kubernetes.io/projected/e4d59c4e-1fd2-43d9-8ac2-d162e746e758-kube-api-access-79tf8\") pod 
\"cinder-operator-controller-manager-55d77d7b5c-fb2wf\" (UID: \"e4d59c4e-1fd2-43d9-8ac2-d162e746e758\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-fb2wf" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.830733 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb"] Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.831589 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.841563 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-4gl88"] Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.843894 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.844166 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-xzphw" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.844297 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-dlhw4" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.849510 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-nfzvw"] Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.856589 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-szs2w"] Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.857816 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-szs2w" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.864480 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-whkbx" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.901694 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb"] Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.925500 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/771a50fd-33f6-47ba-ac4a-46da5446cdd8-cert\") pod \"infra-operator-controller-manager-79d975b745-vhmbb\" (UID: \"771a50fd-33f6-47ba-ac4a-46da5446cdd8\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.925574 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww586\" (UniqueName: \"kubernetes.io/projected/c1807c06-6c68-477c-8725-5702e2d59c93-kube-api-access-ww586\") pod \"horizon-operator-controller-manager-5b9b8895d5-fmbwz\" (UID: \"c1807c06-6c68-477c-8725-5702e2d59c93\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fmbwz" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.925602 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrds7\" (UniqueName: \"kubernetes.io/projected/6739bbb3-bf62-4b1d-8dd7-3accde691e66-kube-api-access-mrds7\") pod \"heat-operator-controller-manager-69f49c598c-nfzvw\" (UID: \"6739bbb3-bf62-4b1d-8dd7-3accde691e66\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-nfzvw" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.925656 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpr5p\" (UniqueName: \"kubernetes.io/projected/771a50fd-33f6-47ba-ac4a-46da5446cdd8-kube-api-access-vpr5p\") pod \"infra-operator-controller-manager-79d975b745-vhmbb\" (UID: \"771a50fd-33f6-47ba-ac4a-46da5446cdd8\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.925712 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlngn\" (UniqueName: \"kubernetes.io/projected/fe1f6a92-751f-417e-b2ff-694c10210db7-kube-api-access-jlngn\") pod \"barbican-operator-controller-manager-868647ff47-rfwpm\" (UID: \"fe1f6a92-751f-417e-b2ff-694c10210db7\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rfwpm" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.925738 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnj2c\" (UniqueName: \"kubernetes.io/projected/66c995b3-f763-455e-8ea3-7dfdfb4c4301-kube-api-access-rnj2c\") pod \"glance-operator-controller-manager-784b5bb6c5-4gl88\" (UID: \"66c995b3-f763-455e-8ea3-7dfdfb4c4301\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-4gl88" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.925785 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4jfx\" (UniqueName: \"kubernetes.io/projected/bd77d7fe-85fb-4b16-aa12-75359b52e139-kube-api-access-p4jfx\") pod \"designate-operator-controller-manager-6d8bf5c495-rn44b\" (UID: \"bd77d7fe-85fb-4b16-aa12-75359b52e139\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-rn44b" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.925815 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79tf8\" (UniqueName: 
\"kubernetes.io/projected/e4d59c4e-1fd2-43d9-8ac2-d162e746e758-kube-api-access-79tf8\") pod \"cinder-operator-controller-manager-55d77d7b5c-fb2wf\" (UID: \"e4d59c4e-1fd2-43d9-8ac2-d162e746e758\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-fb2wf" Feb 27 00:23:27 crc kubenswrapper[4781]: I0227 00:23:27.925891 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj2ws\" (UniqueName: \"kubernetes.io/projected/513da4ed-be63-45dd-a32a-27ac3ef443a5-kube-api-access-sj2ws\") pod \"ironic-operator-controller-manager-554564d7fc-szs2w\" (UID: \"513da4ed-be63-45dd-a32a-27ac3ef443a5\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-szs2w" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.006093 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79tf8\" (UniqueName: \"kubernetes.io/projected/e4d59c4e-1fd2-43d9-8ac2-d162e746e758-kube-api-access-79tf8\") pod \"cinder-operator-controller-manager-55d77d7b5c-fb2wf\" (UID: \"e4d59c4e-1fd2-43d9-8ac2-d162e746e758\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-fb2wf" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.007996 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4jfx\" (UniqueName: \"kubernetes.io/projected/bd77d7fe-85fb-4b16-aa12-75359b52e139-kube-api-access-p4jfx\") pod \"designate-operator-controller-manager-6d8bf5c495-rn44b\" (UID: \"bd77d7fe-85fb-4b16-aa12-75359b52e139\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-rn44b" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.008935 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlngn\" (UniqueName: \"kubernetes.io/projected/fe1f6a92-751f-417e-b2ff-694c10210db7-kube-api-access-jlngn\") pod \"barbican-operator-controller-manager-868647ff47-rfwpm\" (UID: 
\"fe1f6a92-751f-417e-b2ff-694c10210db7\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rfwpm" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.012816 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-szs2w"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.017002 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fmbwz"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.026677 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-2pgf6"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.027583 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-2pgf6" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.028785 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpr5p\" (UniqueName: \"kubernetes.io/projected/771a50fd-33f6-47ba-ac4a-46da5446cdd8-kube-api-access-vpr5p\") pod \"infra-operator-controller-manager-79d975b745-vhmbb\" (UID: \"771a50fd-33f6-47ba-ac4a-46da5446cdd8\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.028818 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnj2c\" (UniqueName: \"kubernetes.io/projected/66c995b3-f763-455e-8ea3-7dfdfb4c4301-kube-api-access-rnj2c\") pod \"glance-operator-controller-manager-784b5bb6c5-4gl88\" (UID: \"66c995b3-f763-455e-8ea3-7dfdfb4c4301\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-4gl88" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.028883 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-sj2ws\" (UniqueName: \"kubernetes.io/projected/513da4ed-be63-45dd-a32a-27ac3ef443a5-kube-api-access-sj2ws\") pod \"ironic-operator-controller-manager-554564d7fc-szs2w\" (UID: \"513da4ed-be63-45dd-a32a-27ac3ef443a5\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-szs2w" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.028907 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/771a50fd-33f6-47ba-ac4a-46da5446cdd8-cert\") pod \"infra-operator-controller-manager-79d975b745-vhmbb\" (UID: \"771a50fd-33f6-47ba-ac4a-46da5446cdd8\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.028927 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww586\" (UniqueName: \"kubernetes.io/projected/c1807c06-6c68-477c-8725-5702e2d59c93-kube-api-access-ww586\") pod \"horizon-operator-controller-manager-5b9b8895d5-fmbwz\" (UID: \"c1807c06-6c68-477c-8725-5702e2d59c93\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fmbwz" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.028948 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrds7\" (UniqueName: \"kubernetes.io/projected/6739bbb3-bf62-4b1d-8dd7-3accde691e66-kube-api-access-mrds7\") pod \"heat-operator-controller-manager-69f49c598c-nfzvw\" (UID: \"6739bbb3-bf62-4b1d-8dd7-3accde691e66\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-nfzvw" Feb 27 00:23:28 crc kubenswrapper[4781]: E0227 00:23:28.029467 4781 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 27 00:23:28 crc kubenswrapper[4781]: E0227 00:23:28.029513 4781 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/771a50fd-33f6-47ba-ac4a-46da5446cdd8-cert podName:771a50fd-33f6-47ba-ac4a-46da5446cdd8 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:28.529496915 +0000 UTC m=+1077.787036469 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/771a50fd-33f6-47ba-ac4a-46da5446cdd8-cert") pod "infra-operator-controller-manager-79d975b745-vhmbb" (UID: "771a50fd-33f6-47ba-ac4a-46da5446cdd8") : secret "infra-operator-webhook-server-cert" not found Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.033573 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-lldlj" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.045605 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rfwpm" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.059301 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-fb2wf" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.074919 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-rn44b" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.079362 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-2pgf6"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.106588 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpr5p\" (UniqueName: \"kubernetes.io/projected/771a50fd-33f6-47ba-ac4a-46da5446cdd8-kube-api-access-vpr5p\") pod \"infra-operator-controller-manager-79d975b745-vhmbb\" (UID: \"771a50fd-33f6-47ba-ac4a-46da5446cdd8\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.115291 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrds7\" (UniqueName: \"kubernetes.io/projected/6739bbb3-bf62-4b1d-8dd7-3accde691e66-kube-api-access-mrds7\") pod \"heat-operator-controller-manager-69f49c598c-nfzvw\" (UID: \"6739bbb3-bf62-4b1d-8dd7-3accde691e66\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-nfzvw" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.115330 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww586\" (UniqueName: \"kubernetes.io/projected/c1807c06-6c68-477c-8725-5702e2d59c93-kube-api-access-ww586\") pod \"horizon-operator-controller-manager-5b9b8895d5-fmbwz\" (UID: \"c1807c06-6c68-477c-8725-5702e2d59c93\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fmbwz" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.115823 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj2ws\" (UniqueName: \"kubernetes.io/projected/513da4ed-be63-45dd-a32a-27ac3ef443a5-kube-api-access-sj2ws\") pod \"ironic-operator-controller-manager-554564d7fc-szs2w\" (UID: 
\"513da4ed-be63-45dd-a32a-27ac3ef443a5\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-szs2w" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.115853 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-jnhdb"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.116559 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnj2c\" (UniqueName: \"kubernetes.io/projected/66c995b3-f763-455e-8ea3-7dfdfb4c4301-kube-api-access-rnj2c\") pod \"glance-operator-controller-manager-784b5bb6c5-4gl88\" (UID: \"66c995b3-f763-455e-8ea3-7dfdfb4c4301\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-4gl88" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.116598 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-jnhdb" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.122050 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-nfzvw" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.131468 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqq4t\" (UniqueName: \"kubernetes.io/projected/057d4c8d-606e-44ea-89ea-fb17b4d63733-kube-api-access-dqq4t\") pod \"keystone-operator-controller-manager-b4d948c87-2pgf6\" (UID: \"057d4c8d-606e-44ea-89ea-fb17b4d63733\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-2pgf6" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.135777 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-w5wp5"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.136569 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-w5wp5" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.136891 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-tzjtp" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.139505 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-94jbq" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.152362 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-w5wp5"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.153954 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fmbwz" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.183138 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-v5hwb"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.184020 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-v5hwb" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.197979 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-2rnp9" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.199748 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-jnhdb"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.213120 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-szs2w" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.220678 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-trb7t"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.221564 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-trb7t" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.229995 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-p8m82" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.234050 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqq4t\" (UniqueName: \"kubernetes.io/projected/057d4c8d-606e-44ea-89ea-fb17b4d63733-kube-api-access-dqq4t\") pod \"keystone-operator-controller-manager-b4d948c87-2pgf6\" (UID: \"057d4c8d-606e-44ea-89ea-fb17b4d63733\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-2pgf6" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.234092 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffzrf\" (UniqueName: \"kubernetes.io/projected/a4e55d6f-0ca4-466c-80d0-cada3ff9f8ad-kube-api-access-ffzrf\") pod \"manila-operator-controller-manager-67d996989d-jnhdb\" (UID: \"a4e55d6f-0ca4-466c-80d0-cada3ff9f8ad\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-jnhdb" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.234128 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp4z9\" (UniqueName: \"kubernetes.io/projected/f777df4b-1040-4f86-a816-ea778b9e5dc3-kube-api-access-fp4z9\") pod 
\"mariadb-operator-controller-manager-6994f66f48-w5wp5\" (UID: \"f777df4b-1040-4f86-a816-ea778b9e5dc3\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-w5wp5" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.242844 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-v5hwb"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.272691 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tb298"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.273571 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tb298" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.284706 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-trb7t"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.299688 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqq4t\" (UniqueName: \"kubernetes.io/projected/057d4c8d-606e-44ea-89ea-fb17b4d63733-kube-api-access-dqq4t\") pod \"keystone-operator-controller-manager-b4d948c87-2pgf6\" (UID: \"057d4c8d-606e-44ea-89ea-fb17b4d63733\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-2pgf6" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.300196 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-ml5q6" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.335649 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp4z9\" (UniqueName: \"kubernetes.io/projected/f777df4b-1040-4f86-a816-ea778b9e5dc3-kube-api-access-fp4z9\") pod \"mariadb-operator-controller-manager-6994f66f48-w5wp5\" (UID: 
\"f777df4b-1040-4f86-a816-ea778b9e5dc3\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-w5wp5" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.335737 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxc2d\" (UniqueName: \"kubernetes.io/projected/fe25346c-5f31-478e-a639-060c5958b1eb-kube-api-access-vxc2d\") pod \"neutron-operator-controller-manager-6bd4687957-v5hwb\" (UID: \"fe25346c-5f31-478e-a639-060c5958b1eb\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-v5hwb" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.335765 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l84m\" (UniqueName: \"kubernetes.io/projected/7d5e1e13-5ce4-48ba-a8c9-3db924e63840-kube-api-access-6l84m\") pod \"octavia-operator-controller-manager-659dc6bbfc-tb298\" (UID: \"7d5e1e13-5ce4-48ba-a8c9-3db924e63840\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tb298" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.335793 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78p8s\" (UniqueName: \"kubernetes.io/projected/e9a3b900-688c-4043-b1ff-53ae1c3ee1d6-kube-api-access-78p8s\") pod \"nova-operator-controller-manager-567668f5cf-trb7t\" (UID: \"e9a3b900-688c-4043-b1ff-53ae1c3ee1d6\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-trb7t" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.335828 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffzrf\" (UniqueName: \"kubernetes.io/projected/a4e55d6f-0ca4-466c-80d0-cada3ff9f8ad-kube-api-access-ffzrf\") pod \"manila-operator-controller-manager-67d996989d-jnhdb\" (UID: \"a4e55d6f-0ca4-466c-80d0-cada3ff9f8ad\") " 
pod="openstack-operators/manila-operator-controller-manager-67d996989d-jnhdb" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.339879 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.356131 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.361667 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-blcnh" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.361785 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.363061 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp4z9\" (UniqueName: \"kubernetes.io/projected/f777df4b-1040-4f86-a816-ea778b9e5dc3-kube-api-access-fp4z9\") pod \"mariadb-operator-controller-manager-6994f66f48-w5wp5\" (UID: \"f777df4b-1040-4f86-a816-ea778b9e5dc3\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-w5wp5" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.366647 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffzrf\" (UniqueName: \"kubernetes.io/projected/a4e55d6f-0ca4-466c-80d0-cada3ff9f8ad-kube-api-access-ffzrf\") pod \"manila-operator-controller-manager-67d996989d-jnhdb\" (UID: \"a4e55d6f-0ca4-466c-80d0-cada3ff9f8ad\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-jnhdb" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.370663 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tb298"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.375613 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-2pgf6" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.407779 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.408982 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-4gl88" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.436745 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-bvdd5"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.437944 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-bvdd5" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.438305 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb9s6\" (UniqueName: \"kubernetes.io/projected/83466be2-d230-4516-b594-ee56aae3c510-kube-api-access-qb9s6\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4\" (UID: \"83466be2-d230-4516-b594-ee56aae3c510\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.450925 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-tc89m" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.452427 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxc2d\" (UniqueName: \"kubernetes.io/projected/fe25346c-5f31-478e-a639-060c5958b1eb-kube-api-access-vxc2d\") pod \"neutron-operator-controller-manager-6bd4687957-v5hwb\" (UID: \"fe25346c-5f31-478e-a639-060c5958b1eb\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-v5hwb" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.452492 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83466be2-d230-4516-b594-ee56aae3c510-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4\" (UID: \"83466be2-d230-4516-b594-ee56aae3c510\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.452511 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l84m\" (UniqueName: 
\"kubernetes.io/projected/7d5e1e13-5ce4-48ba-a8c9-3db924e63840-kube-api-access-6l84m\") pod \"octavia-operator-controller-manager-659dc6bbfc-tb298\" (UID: \"7d5e1e13-5ce4-48ba-a8c9-3db924e63840\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tb298" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.452567 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78p8s\" (UniqueName: \"kubernetes.io/projected/e9a3b900-688c-4043-b1ff-53ae1c3ee1d6-kube-api-access-78p8s\") pod \"nova-operator-controller-manager-567668f5cf-trb7t\" (UID: \"e9a3b900-688c-4043-b1ff-53ae1c3ee1d6\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-trb7t" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.461811 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-5mgl8"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.462702 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-5mgl8" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.468267 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-bv9s7" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.468410 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-bvdd5"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.482059 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78p8s\" (UniqueName: \"kubernetes.io/projected/e9a3b900-688c-4043-b1ff-53ae1c3ee1d6-kube-api-access-78p8s\") pod \"nova-operator-controller-manager-567668f5cf-trb7t\" (UID: \"e9a3b900-688c-4043-b1ff-53ae1c3ee1d6\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-trb7t" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.484246 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-rn2vt"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.485234 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-rn2vt" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.487588 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxc2d\" (UniqueName: \"kubernetes.io/projected/fe25346c-5f31-478e-a639-060c5958b1eb-kube-api-access-vxc2d\") pod \"neutron-operator-controller-manager-6bd4687957-v5hwb\" (UID: \"fe25346c-5f31-478e-a639-060c5958b1eb\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-v5hwb" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.487859 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-zrhpq" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.489902 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l84m\" (UniqueName: \"kubernetes.io/projected/7d5e1e13-5ce4-48ba-a8c9-3db924e63840-kube-api-access-6l84m\") pod \"octavia-operator-controller-manager-659dc6bbfc-tb298\" (UID: \"7d5e1e13-5ce4-48ba-a8c9-3db924e63840\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tb298" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.495597 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-5mgl8"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.499797 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-jnhdb" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.523896 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-w5wp5" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.534977 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-9d678b567-gttml"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.535841 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-9d678b567-gttml" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.542933 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-v5hwb" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.544028 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-6jtdz" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.556369 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zrxh\" (UniqueName: \"kubernetes.io/projected/3747ddf8-799c-441c-bd9d-4450bdb72382-kube-api-access-6zrxh\") pod \"swift-operator-controller-manager-68f46476f-5mgl8\" (UID: \"3747ddf8-799c-441c-bd9d-4450bdb72382\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-5mgl8" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.556458 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83466be2-d230-4516-b594-ee56aae3c510-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4\" (UID: \"83466be2-d230-4516-b594-ee56aae3c510\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.556515 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rvr4\" (UniqueName: \"kubernetes.io/projected/9d03c922-57fc-4f7d-9e6a-b2b6f3b535d1-kube-api-access-4rvr4\") pod \"placement-operator-controller-manager-8497b45c89-rn2vt\" (UID: \"9d03c922-57fc-4f7d-9e6a-b2b6f3b535d1\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-rn2vt" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.556567 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/771a50fd-33f6-47ba-ac4a-46da5446cdd8-cert\") pod \"infra-operator-controller-manager-79d975b745-vhmbb\" (UID: \"771a50fd-33f6-47ba-ac4a-46da5446cdd8\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.556594 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c4x8\" (UniqueName: \"kubernetes.io/projected/fae0f5f8-e721-4ef1-9c8f-4574f156913f-kube-api-access-7c4x8\") pod \"ovn-operator-controller-manager-5955d8c787-bvdd5\" (UID: \"fae0f5f8-e721-4ef1-9c8f-4574f156913f\") " pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-bvdd5" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.556614 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb9s6\" (UniqueName: \"kubernetes.io/projected/83466be2-d230-4516-b594-ee56aae3c510-kube-api-access-qb9s6\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4\" (UID: \"83466be2-d230-4516-b594-ee56aae3c510\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" Feb 27 00:23:28 crc kubenswrapper[4781]: E0227 00:23:28.556619 4781 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not 
found Feb 27 00:23:28 crc kubenswrapper[4781]: E0227 00:23:28.556733 4781 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 27 00:23:28 crc kubenswrapper[4781]: E0227 00:23:28.556752 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83466be2-d230-4516-b594-ee56aae3c510-cert podName:83466be2-d230-4516-b594-ee56aae3c510 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:29.056729114 +0000 UTC m=+1078.314268668 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83466be2-d230-4516-b594-ee56aae3c510-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" (UID: "83466be2-d230-4516-b594-ee56aae3c510") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 00:23:28 crc kubenswrapper[4781]: E0227 00:23:28.556796 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/771a50fd-33f6-47ba-ac4a-46da5446cdd8-cert podName:771a50fd-33f6-47ba-ac4a-46da5446cdd8 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:29.556775826 +0000 UTC m=+1078.814315380 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/771a50fd-33f6-47ba-ac4a-46da5446cdd8-cert") pod "infra-operator-controller-manager-79d975b745-vhmbb" (UID: "771a50fd-33f6-47ba-ac4a-46da5446cdd8") : secret "infra-operator-webhook-server-cert" not found Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.561758 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-trb7t" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.562654 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-rn2vt"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.586878 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb9s6\" (UniqueName: \"kubernetes.io/projected/83466be2-d230-4516-b594-ee56aae3c510-kube-api-access-qb9s6\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4\" (UID: \"83466be2-d230-4516-b594-ee56aae3c510\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.587156 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-9d678b567-gttml"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.650423 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-dc7k2"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.653866 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-dc7k2" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.656010 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-n4k5g" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.658586 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c4x8\" (UniqueName: \"kubernetes.io/projected/fae0f5f8-e721-4ef1-9c8f-4574f156913f-kube-api-access-7c4x8\") pod \"ovn-operator-controller-manager-5955d8c787-bvdd5\" (UID: \"fae0f5f8-e721-4ef1-9c8f-4574f156913f\") " pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-bvdd5" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.658627 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zrxh\" (UniqueName: \"kubernetes.io/projected/3747ddf8-799c-441c-bd9d-4450bdb72382-kube-api-access-6zrxh\") pod \"swift-operator-controller-manager-68f46476f-5mgl8\" (UID: \"3747ddf8-799c-441c-bd9d-4450bdb72382\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-5mgl8" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.658918 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p6kf\" (UniqueName: \"kubernetes.io/projected/11361a5e-18c5-448a-8b07-8f5e3245f607-kube-api-access-7p6kf\") pod \"telemetry-operator-controller-manager-9d678b567-gttml\" (UID: \"11361a5e-18c5-448a-8b07-8f5e3245f607\") " pod="openstack-operators/telemetry-operator-controller-manager-9d678b567-gttml" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.659009 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rvr4\" (UniqueName: \"kubernetes.io/projected/9d03c922-57fc-4f7d-9e6a-b2b6f3b535d1-kube-api-access-4rvr4\") pod 
\"placement-operator-controller-manager-8497b45c89-rn2vt\" (UID: \"9d03c922-57fc-4f7d-9e6a-b2b6f3b535d1\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-rn2vt" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.662037 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tb298" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.676525 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-gs62l"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.693840 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-gs62l" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.694645 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zrxh\" (UniqueName: \"kubernetes.io/projected/3747ddf8-799c-441c-bd9d-4450bdb72382-kube-api-access-6zrxh\") pod \"swift-operator-controller-manager-68f46476f-5mgl8\" (UID: \"3747ddf8-799c-441c-bd9d-4450bdb72382\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-5mgl8" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.695854 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-8c68w" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.698588 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-dc7k2"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.707423 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rvr4\" (UniqueName: \"kubernetes.io/projected/9d03c922-57fc-4f7d-9e6a-b2b6f3b535d1-kube-api-access-4rvr4\") pod \"placement-operator-controller-manager-8497b45c89-rn2vt\" (UID: 
\"9d03c922-57fc-4f7d-9e6a-b2b6f3b535d1\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-rn2vt" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.708092 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c4x8\" (UniqueName: \"kubernetes.io/projected/fae0f5f8-e721-4ef1-9c8f-4574f156913f-kube-api-access-7c4x8\") pod \"ovn-operator-controller-manager-5955d8c787-bvdd5\" (UID: \"fae0f5f8-e721-4ef1-9c8f-4574f156913f\") " pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-bvdd5" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.710080 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-gs62l"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.731605 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.739203 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.741008 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.741219 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-j74ff" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.743729 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.761050 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9s8r\" (UniqueName: \"kubernetes.io/projected/cf1fe81a-282d-4e51-b8d9-d6569a640985-kube-api-access-p9s8r\") pod \"test-operator-controller-manager-5dc6794d5b-dc7k2\" (UID: \"cf1fe81a-282d-4e51-b8d9-d6569a640985\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-dc7k2" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.761100 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfl55\" (UniqueName: \"kubernetes.io/projected/d31610db-32c1-4c99-9001-ab4504649a75-kube-api-access-sfl55\") pod \"watcher-operator-controller-manager-bccc79885-gs62l\" (UID: \"d31610db-32c1-4c99-9001-ab4504649a75\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-gs62l" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.761145 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p6kf\" (UniqueName: \"kubernetes.io/projected/11361a5e-18c5-448a-8b07-8f5e3245f607-kube-api-access-7p6kf\") pod \"telemetry-operator-controller-manager-9d678b567-gttml\" (UID: \"11361a5e-18c5-448a-8b07-8f5e3245f607\") " 
pod="openstack-operators/telemetry-operator-controller-manager-9d678b567-gttml" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.774907 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.792930 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gr7g"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.793997 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gr7g" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.798279 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-bvdd5" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.800464 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p6kf\" (UniqueName: \"kubernetes.io/projected/11361a5e-18c5-448a-8b07-8f5e3245f607-kube-api-access-7p6kf\") pod \"telemetry-operator-controller-manager-9d678b567-gttml\" (UID: \"11361a5e-18c5-448a-8b07-8f5e3245f607\") " pod="openstack-operators/telemetry-operator-controller-manager-9d678b567-gttml" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.802150 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-kg2v9" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.807865 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gr7g"] Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.862831 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9s8r\" (UniqueName: 
\"kubernetes.io/projected/cf1fe81a-282d-4e51-b8d9-d6569a640985-kube-api-access-p9s8r\") pod \"test-operator-controller-manager-5dc6794d5b-dc7k2\" (UID: \"cf1fe81a-282d-4e51-b8d9-d6569a640985\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-dc7k2" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.862881 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfl55\" (UniqueName: \"kubernetes.io/projected/d31610db-32c1-4c99-9001-ab4504649a75-kube-api-access-sfl55\") pod \"watcher-operator-controller-manager-bccc79885-gs62l\" (UID: \"d31610db-32c1-4c99-9001-ab4504649a75\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-gs62l" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.862913 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grqv8\" (UniqueName: \"kubernetes.io/projected/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-kube-api-access-grqv8\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.862948 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.862989 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhd9z\" (UniqueName: \"kubernetes.io/projected/6d15395c-5ed9-43c8-b7f6-ac16e6e32e70-kube-api-access-fhd9z\") pod 
\"rabbitmq-cluster-operator-manager-668c99d594-7gr7g\" (UID: \"6d15395c-5ed9-43c8-b7f6-ac16e6e32e70\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gr7g" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.863031 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-metrics-certs\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.892477 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-5mgl8" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.903174 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9s8r\" (UniqueName: \"kubernetes.io/projected/cf1fe81a-282d-4e51-b8d9-d6569a640985-kube-api-access-p9s8r\") pod \"test-operator-controller-manager-5dc6794d5b-dc7k2\" (UID: \"cf1fe81a-282d-4e51-b8d9-d6569a640985\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-dc7k2" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.907860 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-rn2vt" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.914199 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfl55\" (UniqueName: \"kubernetes.io/projected/d31610db-32c1-4c99-9001-ab4504649a75-kube-api-access-sfl55\") pod \"watcher-operator-controller-manager-bccc79885-gs62l\" (UID: \"d31610db-32c1-4c99-9001-ab4504649a75\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-gs62l" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.966527 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grqv8\" (UniqueName: \"kubernetes.io/projected/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-kube-api-access-grqv8\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.966574 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.966639 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhd9z\" (UniqueName: \"kubernetes.io/projected/6d15395c-5ed9-43c8-b7f6-ac16e6e32e70-kube-api-access-fhd9z\") pod \"rabbitmq-cluster-operator-manager-668c99d594-7gr7g\" (UID: \"6d15395c-5ed9-43c8-b7f6-ac16e6e32e70\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gr7g" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.966660 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-metrics-certs\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:28 crc kubenswrapper[4781]: E0227 00:23:28.966799 4781 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 27 00:23:28 crc kubenswrapper[4781]: E0227 00:23:28.966845 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-metrics-certs podName:9fe881c2-cb59-41ce-a23c-f2dcba86d9c3 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:29.466830441 +0000 UTC m=+1078.724369995 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-metrics-certs") pod "openstack-operator-controller-manager-7677fd857d-kxknf" (UID: "9fe881c2-cb59-41ce-a23c-f2dcba86d9c3") : secret "metrics-server-cert" not found Feb 27 00:23:28 crc kubenswrapper[4781]: E0227 00:23:28.967084 4781 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 27 00:23:28 crc kubenswrapper[4781]: E0227 00:23:28.967107 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs podName:9fe881c2-cb59-41ce-a23c-f2dcba86d9c3 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:29.467100638 +0000 UTC m=+1078.724640192 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs") pod "openstack-operator-controller-manager-7677fd857d-kxknf" (UID: "9fe881c2-cb59-41ce-a23c-f2dcba86d9c3") : secret "webhook-server-cert" not found Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.983661 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhd9z\" (UniqueName: \"kubernetes.io/projected/6d15395c-5ed9-43c8-b7f6-ac16e6e32e70-kube-api-access-fhd9z\") pod \"rabbitmq-cluster-operator-manager-668c99d594-7gr7g\" (UID: \"6d15395c-5ed9-43c8-b7f6-ac16e6e32e70\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gr7g" Feb 27 00:23:28 crc kubenswrapper[4781]: I0227 00:23:28.985273 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grqv8\" (UniqueName: \"kubernetes.io/projected/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-kube-api-access-grqv8\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.019589 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-9d678b567-gttml" Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.043067 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-dc7k2" Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.069664 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83466be2-d230-4516-b594-ee56aae3c510-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4\" (UID: \"83466be2-d230-4516-b594-ee56aae3c510\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.069837 4781 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.069880 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83466be2-d230-4516-b594-ee56aae3c510-cert podName:83466be2-d230-4516-b594-ee56aae3c510 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:30.069867834 +0000 UTC m=+1079.327407388 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83466be2-d230-4516-b594-ee56aae3c510-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" (UID: "83466be2-d230-4516-b594-ee56aae3c510") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.075826 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-gs62l" Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.081881 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-fb2wf"] Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.095264 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-nfzvw"] Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.104319 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-rfwpm"] Feb 27 00:23:29 crc kubenswrapper[4781]: W0227 00:23:29.107837 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6739bbb3_bf62_4b1d_8dd7_3accde691e66.slice/crio-c8d874c90ac7f39ebabac7500db8f41a011950198638aad7b783a07f6bdb6f92 WatchSource:0}: Error finding container c8d874c90ac7f39ebabac7500db8f41a011950198638aad7b783a07f6bdb6f92: Status 404 returned error can't find the container with id c8d874c90ac7f39ebabac7500db8f41a011950198638aad7b783a07f6bdb6f92 Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.134517 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gr7g" Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.288525 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fmbwz"] Feb 27 00:23:29 crc kubenswrapper[4781]: W0227 00:23:29.300593 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1807c06_6c68_477c_8725_5702e2d59c93.slice/crio-7e627c843460bd7de182acf05ba1b7b604c4be0586d0a39818f12398564f268f WatchSource:0}: Error finding container 7e627c843460bd7de182acf05ba1b7b604c4be0586d0a39818f12398564f268f: Status 404 returned error can't find the container with id 7e627c843460bd7de182acf05ba1b7b604c4be0586d0a39818f12398564f268f Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.321061 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-rn44b"] Feb 27 00:23:29 crc kubenswrapper[4781]: W0227 00:23:29.324089 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd77d7fe_85fb_4b16_aa12_75359b52e139.slice/crio-b334f23af544bcdb23e245902b2a580ff297a1dc8c4d52da3bf273b3914fa6bc WatchSource:0}: Error finding container b334f23af544bcdb23e245902b2a580ff297a1dc8c4d52da3bf273b3914fa6bc: Status 404 returned error can't find the container with id b334f23af544bcdb23e245902b2a580ff297a1dc8c4d52da3bf273b3914fa6bc Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.360132 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fmbwz" event={"ID":"c1807c06-6c68-477c-8725-5702e2d59c93","Type":"ContainerStarted","Data":"7e627c843460bd7de182acf05ba1b7b604c4be0586d0a39818f12398564f268f"} Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.360929 4781 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-rn44b" event={"ID":"bd77d7fe-85fb-4b16-aa12-75359b52e139","Type":"ContainerStarted","Data":"b334f23af544bcdb23e245902b2a580ff297a1dc8c4d52da3bf273b3914fa6bc"} Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.361719 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-fb2wf" event={"ID":"e4d59c4e-1fd2-43d9-8ac2-d162e746e758","Type":"ContainerStarted","Data":"97a6ec41c3ce1eedeab33905b51fdd25b1760cafc7074733bf3862ffde23fd61"} Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.362549 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-nfzvw" event={"ID":"6739bbb3-bf62-4b1d-8dd7-3accde691e66","Type":"ContainerStarted","Data":"c8d874c90ac7f39ebabac7500db8f41a011950198638aad7b783a07f6bdb6f92"} Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.363542 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rfwpm" event={"ID":"fe1f6a92-751f-417e-b2ff-694c10210db7","Type":"ContainerStarted","Data":"1f4bbde81c9128842228ac366fd53d66674cc5b244ec0c9beaf14bc661503355"} Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.425975 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-v5hwb"] Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.432689 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-4gl88"] Feb 27 00:23:29 crc kubenswrapper[4781]: W0227 00:23:29.434527 4781 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe25346c_5f31_478e_a639_060c5958b1eb.slice/crio-58b060d9a69f405cfd8981fcb83cce5701eb693e39ed61f0498941eafbd317ef WatchSource:0}: Error finding container 58b060d9a69f405cfd8981fcb83cce5701eb693e39ed61f0498941eafbd317ef: Status 404 returned error can't find the container with id 58b060d9a69f405cfd8981fcb83cce5701eb693e39ed61f0498941eafbd317ef Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.437991 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-trb7t"] Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.442513 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-2pgf6"] Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.447703 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-szs2w"] Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.475547 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.475617 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-metrics-certs\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.475806 4781 secret.go:188] Couldn't get secret 
openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.475855 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-metrics-certs podName:9fe881c2-cb59-41ce-a23c-f2dcba86d9c3 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:30.475840554 +0000 UTC m=+1079.733380108 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-metrics-certs") pod "openstack-operator-controller-manager-7677fd857d-kxknf" (UID: "9fe881c2-cb59-41ce-a23c-f2dcba86d9c3") : secret "metrics-server-cert" not found Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.477099 4781 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.477163 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs podName:9fe881c2-cb59-41ce-a23c-f2dcba86d9c3 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:30.477145958 +0000 UTC m=+1079.734685512 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs") pod "openstack-operator-controller-manager-7677fd857d-kxknf" (UID: "9fe881c2-cb59-41ce-a23c-f2dcba86d9c3") : secret "webhook-server-cert" not found Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.577216 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/771a50fd-33f6-47ba-ac4a-46da5446cdd8-cert\") pod \"infra-operator-controller-manager-79d975b745-vhmbb\" (UID: \"771a50fd-33f6-47ba-ac4a-46da5446cdd8\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.577408 4781 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.577483 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/771a50fd-33f6-47ba-ac4a-46da5446cdd8-cert podName:771a50fd-33f6-47ba-ac4a-46da5446cdd8 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:31.57746345 +0000 UTC m=+1080.835003004 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/771a50fd-33f6-47ba-ac4a-46da5446cdd8-cert") pod "infra-operator-controller-manager-79d975b745-vhmbb" (UID: "771a50fd-33f6-47ba-ac4a-46da5446cdd8") : secret "infra-operator-webhook-server-cert" not found Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.637147 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-5mgl8"] Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.649792 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-jnhdb"] Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.661800 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:c7c7d4228994efb8b93cfabe4d78b40b085d91848dc49db247b7bbca689dae06,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6l84m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-659dc6bbfc-tb298_openstack-operators(7d5e1e13-5ce4-48ba-a8c9-3db924e63840): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.662968 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tb298" podUID="7d5e1e13-5ce4-48ba-a8c9-3db924e63840" Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.665370 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tb298"] Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.672991 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4rvr4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-8497b45c89-rn2vt_openstack-operators(9d03c922-57fc-4f7d-9e6a-b2b6f3b535d1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.683603 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-rn2vt" podUID="9d03c922-57fc-4f7d-9e6a-b2b6f3b535d1" Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.697581 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-rn2vt"] Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.701779 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fp4z9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-6994f66f48-w5wp5_openstack-operators(f777df4b-1040-4f86-a816-ea778b9e5dc3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.702953 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-w5wp5" podUID="f777df4b-1040-4f86-a816-ea778b9e5dc3" Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.705187 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-w5wp5"] Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.752814 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-bvdd5"] Feb 27 00:23:29 crc kubenswrapper[4781]: W0227 00:23:29.756415 4781 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfae0f5f8_e721_4ef1_9c8f_4574f156913f.slice/crio-f773426c28e98459761fb10a984e36ed348487207e7a1cf937002fafbc60f04e WatchSource:0}: Error finding container f773426c28e98459761fb10a984e36ed348487207e7a1cf937002fafbc60f04e: Status 404 returned error can't find the container with id f773426c28e98459761fb10a984e36ed348487207e7a1cf937002fafbc60f04e Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.770676 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-gs62l"] Feb 27 00:23:29 crc kubenswrapper[4781]: W0227 00:23:29.774840 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd31610db_32c1_4c99_9001_ab4504649a75.slice/crio-26556b50c7ff457d0b0f5b1bf419e89f769e57ef9cdf44f9ecb57d188af3cf1e WatchSource:0}: Error finding container 26556b50c7ff457d0b0f5b1bf419e89f769e57ef9cdf44f9ecb57d188af3cf1e: Status 404 returned error can't find the container with id 26556b50c7ff457d0b0f5b1bf419e89f769e57ef9cdf44f9ecb57d188af3cf1e Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.777014 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sfl55,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-bccc79885-gs62l_openstack-operators(d31610db-32c1-4c99-9001-ab4504649a75): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 27 00:23:29 crc kubenswrapper[4781]: W0227 00:23:29.777098 4781 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11361a5e_18c5_448a_8b07_8f5e3245f607.slice/crio-4d0ecadaddf7458069bcc13ce6c840b2309e0e3d7c02c93140ac8ec476108dc5 WatchSource:0}: Error finding container 4d0ecadaddf7458069bcc13ce6c840b2309e0e3d7c02c93140ac8ec476108dc5: Status 404 returned error can't find the container with id 4d0ecadaddf7458069bcc13ce6c840b2309e0e3d7c02c93140ac8ec476108dc5 Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.778700 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-gs62l" podUID="d31610db-32c1-4c99-9001-ab4504649a75" Feb 27 00:23:29 crc kubenswrapper[4781]: W0227 00:23:29.779044 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf1fe81a_282d_4e51_b8d9_d6569a640985.slice/crio-95499e972581b730f0cea49768aa004bf09268400286562361694f99b1cbf4c8 WatchSource:0}: Error finding container 95499e972581b730f0cea49768aa004bf09268400286562361694f99b1cbf4c8: Status 404 returned error can't find the container with id 95499e972581b730f0cea49768aa004bf09268400286562361694f99b1cbf4c8 Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.779104 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-9d678b567-gttml"] Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.780602 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.17:5001/openstack-k8s-operators/telemetry-operator:39a4be8a175d9e84fa6ba159f906a95524540b13,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7p6kf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-9d678b567-gttml_openstack-operators(11361a5e-18c5-448a-8b07-8f5e3245f607): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.781952 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-9d678b567-gttml" podUID="11361a5e-18c5-448a-8b07-8f5e3245f607" Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.782395 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:38e6a5bd24ab1684f22a64186fe99a7cdc7897eb7feb715ec1704eea7596dd98,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p9s8r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5dc6794d5b-dc7k2_openstack-operators(cf1fe81a-282d-4e51-b8d9-d6569a640985): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.784008 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-dc7k2" podUID="cf1fe81a-282d-4e51-b8d9-d6569a640985" Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.788402 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fhd9z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-7gr7g_openstack-operators(6d15395c-5ed9-43c8-b7f6-ac16e6e32e70): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.789724 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gr7g"] Feb 27 00:23:29 crc kubenswrapper[4781]: E0227 00:23:29.789978 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gr7g" podUID="6d15395c-5ed9-43c8-b7f6-ac16e6e32e70" Feb 27 00:23:29 crc kubenswrapper[4781]: I0227 00:23:29.795290 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-dc7k2"] Feb 27 00:23:30 crc kubenswrapper[4781]: I0227 00:23:30.098023 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83466be2-d230-4516-b594-ee56aae3c510-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4\" (UID: \"83466be2-d230-4516-b594-ee56aae3c510\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" Feb 27 00:23:30 crc kubenswrapper[4781]: E0227 00:23:30.098521 4781 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 00:23:30 crc kubenswrapper[4781]: E0227 00:23:30.098581 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83466be2-d230-4516-b594-ee56aae3c510-cert podName:83466be2-d230-4516-b594-ee56aae3c510 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:32.098564389 +0000 UTC m=+1081.356103943 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83466be2-d230-4516-b594-ee56aae3c510-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" (UID: "83466be2-d230-4516-b594-ee56aae3c510") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 00:23:30 crc kubenswrapper[4781]: I0227 00:23:30.385242 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-gs62l" event={"ID":"d31610db-32c1-4c99-9001-ab4504649a75","Type":"ContainerStarted","Data":"26556b50c7ff457d0b0f5b1bf419e89f769e57ef9cdf44f9ecb57d188af3cf1e"} Feb 27 00:23:30 crc kubenswrapper[4781]: E0227 00:23:30.393239 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-gs62l" podUID="d31610db-32c1-4c99-9001-ab4504649a75" Feb 27 00:23:30 crc kubenswrapper[4781]: I0227 00:23:30.395154 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-w5wp5" event={"ID":"f777df4b-1040-4f86-a816-ea778b9e5dc3","Type":"ContainerStarted","Data":"5006b2714949060f6141e8bf358bd04edd9af86f59f2c746c148ad75f86a8685"} Feb 27 00:23:30 crc kubenswrapper[4781]: E0227 00:23:30.396521 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-w5wp5" podUID="f777df4b-1040-4f86-a816-ea778b9e5dc3" Feb 27 00:23:30 crc kubenswrapper[4781]: 
I0227 00:23:30.396694 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-2pgf6" event={"ID":"057d4c8d-606e-44ea-89ea-fb17b4d63733","Type":"ContainerStarted","Data":"54e700b978a1894bf048057301bd7f5c2c25d78229d46ec352288697447febc9"} Feb 27 00:23:30 crc kubenswrapper[4781]: I0227 00:23:30.399279 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-v5hwb" event={"ID":"fe25346c-5f31-478e-a639-060c5958b1eb","Type":"ContainerStarted","Data":"58b060d9a69f405cfd8981fcb83cce5701eb693e39ed61f0498941eafbd317ef"} Feb 27 00:23:30 crc kubenswrapper[4781]: I0227 00:23:30.407755 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tb298" event={"ID":"7d5e1e13-5ce4-48ba-a8c9-3db924e63840","Type":"ContainerStarted","Data":"3588b8ee5c71d52bec6cd15f9541e3423f20f5b62a21d1be12c78bc9f096d81c"} Feb 27 00:23:30 crc kubenswrapper[4781]: E0227 00:23:30.409360 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:c7c7d4228994efb8b93cfabe4d78b40b085d91848dc49db247b7bbca689dae06\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tb298" podUID="7d5e1e13-5ce4-48ba-a8c9-3db924e63840" Feb 27 00:23:30 crc kubenswrapper[4781]: I0227 00:23:30.413776 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-szs2w" event={"ID":"513da4ed-be63-45dd-a32a-27ac3ef443a5","Type":"ContainerStarted","Data":"54a54e61d3fe738352352c66c40b4b668766e1c1757771f9ad158b183e5ee63e"} Feb 27 00:23:30 crc kubenswrapper[4781]: I0227 00:23:30.417012 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/telemetry-operator-controller-manager-9d678b567-gttml" event={"ID":"11361a5e-18c5-448a-8b07-8f5e3245f607","Type":"ContainerStarted","Data":"4d0ecadaddf7458069bcc13ce6c840b2309e0e3d7c02c93140ac8ec476108dc5"} Feb 27 00:23:30 crc kubenswrapper[4781]: E0227 00:23:30.418166 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.17:5001/openstack-k8s-operators/telemetry-operator:39a4be8a175d9e84fa6ba159f906a95524540b13\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-9d678b567-gttml" podUID="11361a5e-18c5-448a-8b07-8f5e3245f607" Feb 27 00:23:30 crc kubenswrapper[4781]: I0227 00:23:30.419280 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-trb7t" event={"ID":"e9a3b900-688c-4043-b1ff-53ae1c3ee1d6","Type":"ContainerStarted","Data":"060e8ffe2106a86b9f842715e1a1d726e3f240b1b652c3577e7be28b4e5e1287"} Feb 27 00:23:30 crc kubenswrapper[4781]: I0227 00:23:30.422310 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-jnhdb" event={"ID":"a4e55d6f-0ca4-466c-80d0-cada3ff9f8ad","Type":"ContainerStarted","Data":"45d3a51fbefc905a529c4fc48673fc6eb0e6d525982fe5d486ec663988b95812"} Feb 27 00:23:30 crc kubenswrapper[4781]: I0227 00:23:30.429521 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-rn2vt" event={"ID":"9d03c922-57fc-4f7d-9e6a-b2b6f3b535d1","Type":"ContainerStarted","Data":"81a298accb6bbe5272d844b50c60ceb3883903dce2e219da7f617f392d9406d9"} Feb 27 00:23:30 crc kubenswrapper[4781]: E0227 00:23:30.430939 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-rn2vt" podUID="9d03c922-57fc-4f7d-9e6a-b2b6f3b535d1" Feb 27 00:23:30 crc kubenswrapper[4781]: I0227 00:23:30.433088 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gr7g" event={"ID":"6d15395c-5ed9-43c8-b7f6-ac16e6e32e70","Type":"ContainerStarted","Data":"ce1a0348bf72abb9487d943efd38681162d4fa08b6be47c16c9c3662cc3b2c28"} Feb 27 00:23:30 crc kubenswrapper[4781]: I0227 00:23:30.435139 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-dc7k2" event={"ID":"cf1fe81a-282d-4e51-b8d9-d6569a640985","Type":"ContainerStarted","Data":"95499e972581b730f0cea49768aa004bf09268400286562361694f99b1cbf4c8"} Feb 27 00:23:30 crc kubenswrapper[4781]: E0227 00:23:30.435776 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gr7g" podUID="6d15395c-5ed9-43c8-b7f6-ac16e6e32e70" Feb 27 00:23:30 crc kubenswrapper[4781]: E0227 00:23:30.438378 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:38e6a5bd24ab1684f22a64186fe99a7cdc7897eb7feb715ec1704eea7596dd98\\\"\"" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-dc7k2" podUID="cf1fe81a-282d-4e51-b8d9-d6569a640985" Feb 27 00:23:30 crc kubenswrapper[4781]: I0227 00:23:30.440203 4781 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-4gl88" event={"ID":"66c995b3-f763-455e-8ea3-7dfdfb4c4301","Type":"ContainerStarted","Data":"edd12e3c2539dadd2f78b86bd4c3fcccd1dbd25f0c6559732f8f5ade5b6c1b27"} Feb 27 00:23:30 crc kubenswrapper[4781]: I0227 00:23:30.441607 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-bvdd5" event={"ID":"fae0f5f8-e721-4ef1-9c8f-4574f156913f","Type":"ContainerStarted","Data":"f773426c28e98459761fb10a984e36ed348487207e7a1cf937002fafbc60f04e"} Feb 27 00:23:30 crc kubenswrapper[4781]: I0227 00:23:30.442915 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-5mgl8" event={"ID":"3747ddf8-799c-441c-bd9d-4450bdb72382","Type":"ContainerStarted","Data":"67b2717335587eb39bb913c5d5a6459f2ea1b9d9449ad76347a23f67f3753779"} Feb 27 00:23:30 crc kubenswrapper[4781]: I0227 00:23:30.507348 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-metrics-certs\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:30 crc kubenswrapper[4781]: I0227 00:23:30.507494 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:30 crc kubenswrapper[4781]: E0227 00:23:30.508657 4781 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret 
"metrics-server-cert" not found Feb 27 00:23:30 crc kubenswrapper[4781]: E0227 00:23:30.508697 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-metrics-certs podName:9fe881c2-cb59-41ce-a23c-f2dcba86d9c3 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:32.508682628 +0000 UTC m=+1081.766222182 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-metrics-certs") pod "openstack-operator-controller-manager-7677fd857d-kxknf" (UID: "9fe881c2-cb59-41ce-a23c-f2dcba86d9c3") : secret "metrics-server-cert" not found Feb 27 00:23:30 crc kubenswrapper[4781]: E0227 00:23:30.509679 4781 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 27 00:23:30 crc kubenswrapper[4781]: E0227 00:23:30.509711 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs podName:9fe881c2-cb59-41ce-a23c-f2dcba86d9c3 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:32.509701554 +0000 UTC m=+1081.767241108 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs") pod "openstack-operator-controller-manager-7677fd857d-kxknf" (UID: "9fe881c2-cb59-41ce-a23c-f2dcba86d9c3") : secret "webhook-server-cert" not found Feb 27 00:23:31 crc kubenswrapper[4781]: E0227 00:23:31.480001 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:c7c7d4228994efb8b93cfabe4d78b40b085d91848dc49db247b7bbca689dae06\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tb298" podUID="7d5e1e13-5ce4-48ba-a8c9-3db924e63840" Feb 27 00:23:31 crc kubenswrapper[4781]: E0227 00:23:31.480130 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.17:5001/openstack-k8s-operators/telemetry-operator:39a4be8a175d9e84fa6ba159f906a95524540b13\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-9d678b567-gttml" podUID="11361a5e-18c5-448a-8b07-8f5e3245f607" Feb 27 00:23:31 crc kubenswrapper[4781]: E0227 00:23:31.480158 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-gs62l" podUID="d31610db-32c1-4c99-9001-ab4504649a75" Feb 27 00:23:31 crc kubenswrapper[4781]: E0227 00:23:31.480176 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-rn2vt" podUID="9d03c922-57fc-4f7d-9e6a-b2b6f3b535d1" Feb 27 00:23:31 crc kubenswrapper[4781]: E0227 00:23:31.480180 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gr7g" podUID="6d15395c-5ed9-43c8-b7f6-ac16e6e32e70" Feb 27 00:23:31 crc kubenswrapper[4781]: E0227 00:23:31.480210 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:38e6a5bd24ab1684f22a64186fe99a7cdc7897eb7feb715ec1704eea7596dd98\\\"\"" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-dc7k2" podUID="cf1fe81a-282d-4e51-b8d9-d6569a640985" Feb 27 00:23:31 crc kubenswrapper[4781]: E0227 00:23:31.480216 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-w5wp5" podUID="f777df4b-1040-4f86-a816-ea778b9e5dc3" Feb 27 00:23:31 crc kubenswrapper[4781]: I0227 00:23:31.634218 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/771a50fd-33f6-47ba-ac4a-46da5446cdd8-cert\") pod \"infra-operator-controller-manager-79d975b745-vhmbb\" (UID: 
\"771a50fd-33f6-47ba-ac4a-46da5446cdd8\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" Feb 27 00:23:31 crc kubenswrapper[4781]: E0227 00:23:31.635920 4781 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 27 00:23:31 crc kubenswrapper[4781]: E0227 00:23:31.635963 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/771a50fd-33f6-47ba-ac4a-46da5446cdd8-cert podName:771a50fd-33f6-47ba-ac4a-46da5446cdd8 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:35.635949959 +0000 UTC m=+1084.893489513 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/771a50fd-33f6-47ba-ac4a-46da5446cdd8-cert") pod "infra-operator-controller-manager-79d975b745-vhmbb" (UID: "771a50fd-33f6-47ba-ac4a-46da5446cdd8") : secret "infra-operator-webhook-server-cert" not found Feb 27 00:23:32 crc kubenswrapper[4781]: I0227 00:23:32.150147 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83466be2-d230-4516-b594-ee56aae3c510-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4\" (UID: \"83466be2-d230-4516-b594-ee56aae3c510\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" Feb 27 00:23:32 crc kubenswrapper[4781]: E0227 00:23:32.150360 4781 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 00:23:32 crc kubenswrapper[4781]: E0227 00:23:32.150456 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83466be2-d230-4516-b594-ee56aae3c510-cert podName:83466be2-d230-4516-b594-ee56aae3c510 nodeName:}" failed. 
No retries permitted until 2026-02-27 00:23:36.150435795 +0000 UTC m=+1085.407975349 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83466be2-d230-4516-b594-ee56aae3c510-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" (UID: "83466be2-d230-4516-b594-ee56aae3c510") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 00:23:32 crc kubenswrapper[4781]: I0227 00:23:32.555254 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-metrics-certs\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:32 crc kubenswrapper[4781]: E0227 00:23:32.555445 4781 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 27 00:23:32 crc kubenswrapper[4781]: I0227 00:23:32.555701 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:32 crc kubenswrapper[4781]: E0227 00:23:32.555739 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-metrics-certs podName:9fe881c2-cb59-41ce-a23c-f2dcba86d9c3 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:36.555709716 +0000 UTC m=+1085.813249270 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-metrics-certs") pod "openstack-operator-controller-manager-7677fd857d-kxknf" (UID: "9fe881c2-cb59-41ce-a23c-f2dcba86d9c3") : secret "metrics-server-cert" not found Feb 27 00:23:32 crc kubenswrapper[4781]: E0227 00:23:32.555835 4781 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 27 00:23:32 crc kubenswrapper[4781]: E0227 00:23:32.555891 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs podName:9fe881c2-cb59-41ce-a23c-f2dcba86d9c3 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:36.55587564 +0000 UTC m=+1085.813415194 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs") pod "openstack-operator-controller-manager-7677fd857d-kxknf" (UID: "9fe881c2-cb59-41ce-a23c-f2dcba86d9c3") : secret "webhook-server-cert" not found Feb 27 00:23:35 crc kubenswrapper[4781]: I0227 00:23:35.708450 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/771a50fd-33f6-47ba-ac4a-46da5446cdd8-cert\") pod \"infra-operator-controller-manager-79d975b745-vhmbb\" (UID: \"771a50fd-33f6-47ba-ac4a-46da5446cdd8\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" Feb 27 00:23:35 crc kubenswrapper[4781]: E0227 00:23:35.708694 4781 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 27 00:23:35 crc kubenswrapper[4781]: E0227 00:23:35.708975 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/771a50fd-33f6-47ba-ac4a-46da5446cdd8-cert 
podName:771a50fd-33f6-47ba-ac4a-46da5446cdd8 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:43.708954236 +0000 UTC m=+1092.966493790 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/771a50fd-33f6-47ba-ac4a-46da5446cdd8-cert") pod "infra-operator-controller-manager-79d975b745-vhmbb" (UID: "771a50fd-33f6-47ba-ac4a-46da5446cdd8") : secret "infra-operator-webhook-server-cert" not found Feb 27 00:23:36 crc kubenswrapper[4781]: I0227 00:23:36.216934 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83466be2-d230-4516-b594-ee56aae3c510-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4\" (UID: \"83466be2-d230-4516-b594-ee56aae3c510\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" Feb 27 00:23:36 crc kubenswrapper[4781]: E0227 00:23:36.217113 4781 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 00:23:36 crc kubenswrapper[4781]: E0227 00:23:36.217223 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83466be2-d230-4516-b594-ee56aae3c510-cert podName:83466be2-d230-4516-b594-ee56aae3c510 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:44.217199498 +0000 UTC m=+1093.474739062 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83466be2-d230-4516-b594-ee56aae3c510-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" (UID: "83466be2-d230-4516-b594-ee56aae3c510") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 00:23:36 crc kubenswrapper[4781]: I0227 00:23:36.626544 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:36 crc kubenswrapper[4781]: I0227 00:23:36.626738 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-metrics-certs\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:36 crc kubenswrapper[4781]: E0227 00:23:36.626824 4781 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 27 00:23:36 crc kubenswrapper[4781]: E0227 00:23:36.626955 4781 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 27 00:23:36 crc kubenswrapper[4781]: E0227 00:23:36.626966 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs podName:9fe881c2-cb59-41ce-a23c-f2dcba86d9c3 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:44.626927577 +0000 UTC m=+1093.884467171 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs") pod "openstack-operator-controller-manager-7677fd857d-kxknf" (UID: "9fe881c2-cb59-41ce-a23c-f2dcba86d9c3") : secret "webhook-server-cert" not found Feb 27 00:23:36 crc kubenswrapper[4781]: E0227 00:23:36.627019 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-metrics-certs podName:9fe881c2-cb59-41ce-a23c-f2dcba86d9c3 nodeName:}" failed. No retries permitted until 2026-02-27 00:23:44.626996308 +0000 UTC m=+1093.884535952 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-metrics-certs") pod "openstack-operator-controller-manager-7677fd857d-kxknf" (UID: "9fe881c2-cb59-41ce-a23c-f2dcba86d9c3") : secret "metrics-server-cert" not found Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.572332 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-bvdd5" event={"ID":"fae0f5f8-e721-4ef1-9c8f-4574f156913f","Type":"ContainerStarted","Data":"ebab2794d9e030be0865957878bdaa84f8b2f279a1def3bd5ca3f62fdc716e9a"} Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.572998 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-bvdd5" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.583014 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fmbwz" event={"ID":"c1807c06-6c68-477c-8725-5702e2d59c93","Type":"ContainerStarted","Data":"e1e76b1c9f0d463e5f7ab46694ae8e019f9fea48aa6d07e7b6d4666c0655a794"} Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.583113 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fmbwz" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.587981 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-5mgl8" event={"ID":"3747ddf8-799c-441c-bd9d-4450bdb72382","Type":"ContainerStarted","Data":"299a1389dae53d1431a0c0bd7ebf880145d792dbe52ed0c2b39dfb14c873121d"} Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.588178 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-5mgl8" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.598237 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-jnhdb" event={"ID":"a4e55d6f-0ca4-466c-80d0-cada3ff9f8ad","Type":"ContainerStarted","Data":"3788d7780bf568a2713303959311410200b328c2042d2cd38c7d8f3aba1e0a19"} Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.598540 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-67d996989d-jnhdb" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.607208 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-fb2wf" event={"ID":"e4d59c4e-1fd2-43d9-8ac2-d162e746e758","Type":"ContainerStarted","Data":"6fd7dae2aa3f05d8711ac2cda15fffd04c282e92e56ccc013ae6834e00bd1081"} Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.607346 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-fb2wf" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.613976 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-4gl88" 
event={"ID":"66c995b3-f763-455e-8ea3-7dfdfb4c4301","Type":"ContainerStarted","Data":"8ac8efa04cc24746772ee35fea845a9d140de729d58d693bb10c72e413c876e1"} Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.614067 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-4gl88" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.617764 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-nfzvw" event={"ID":"6739bbb3-bf62-4b1d-8dd7-3accde691e66","Type":"ContainerStarted","Data":"18dd82aa88f3d058ab7fc03fcc9687487f0bd26f7241f7339aac4aa4c409161b"} Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.618130 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-nfzvw" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.627936 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rfwpm" event={"ID":"fe1f6a92-751f-417e-b2ff-694c10210db7","Type":"ContainerStarted","Data":"0b08e8fd35e96179fd376eaef5cebaa658116cfeb02226b8d7ecd8198e0b5eb3"} Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.629005 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rfwpm" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.645016 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-rn44b" event={"ID":"bd77d7fe-85fb-4b16-aa12-75359b52e139","Type":"ContainerStarted","Data":"f5c3c43efc443ab99410325394c2ef43279dae4e8ca4f6328f5acecf3c7873e4"} Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.645485 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-rn44b" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.662182 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-2pgf6" event={"ID":"057d4c8d-606e-44ea-89ea-fb17b4d63733","Type":"ContainerStarted","Data":"dd20b9bef99a25d947b9ff22e14ccb489351f961d60c15a816f7d746fdeca5b5"} Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.662759 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-2pgf6" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.677581 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-v5hwb" event={"ID":"fe25346c-5f31-478e-a639-060c5958b1eb","Type":"ContainerStarted","Data":"7f7af3e6ab0d27e04e135918941e8b4fdd6816db2486a20d90b12222cb813ba8"} Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.677992 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-v5hwb" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.687356 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-bvdd5" podStartSLOduration=2.7920212429999998 podStartE2EDuration="14.687337475s" podCreationTimestamp="2026-02-27 00:23:28 +0000 UTC" firstStartedPulling="2026-02-27 00:23:29.761292575 +0000 UTC m=+1079.018832129" lastFinishedPulling="2026-02-27 00:23:41.656608807 +0000 UTC m=+1090.914148361" observedRunningTime="2026-02-27 00:23:42.63933317 +0000 UTC m=+1091.896872724" watchObservedRunningTime="2026-02-27 00:23:42.687337475 +0000 UTC m=+1091.944877029" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.688372 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-4gl88" podStartSLOduration=3.5054701440000002 podStartE2EDuration="15.688365632s" podCreationTimestamp="2026-02-27 00:23:27 +0000 UTC" firstStartedPulling="2026-02-27 00:23:29.442809991 +0000 UTC m=+1078.700349545" lastFinishedPulling="2026-02-27 00:23:41.625705479 +0000 UTC m=+1090.883245033" observedRunningTime="2026-02-27 00:23:42.687809807 +0000 UTC m=+1091.945349361" watchObservedRunningTime="2026-02-27 00:23:42.688365632 +0000 UTC m=+1091.945905186" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.692275 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-szs2w" event={"ID":"513da4ed-be63-45dd-a32a-27ac3ef443a5","Type":"ContainerStarted","Data":"5dd297adcf1fc3e0814087838651a7c1568abfd0396891b04947722d4e83c15e"} Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.693005 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-szs2w" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.707289 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-trb7t" event={"ID":"e9a3b900-688c-4043-b1ff-53ae1c3ee1d6","Type":"ContainerStarted","Data":"f0c143f8e1db419222ec838d37b6754fd18b71bbe9ee5893a54f0d84c18d707e"} Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.708353 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-trb7t" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.758812 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fmbwz" podStartSLOduration=3.458429424 podStartE2EDuration="15.758793152s" podCreationTimestamp="2026-02-27 00:23:27 +0000 UTC" 
firstStartedPulling="2026-02-27 00:23:29.304579348 +0000 UTC m=+1078.562118902" lastFinishedPulling="2026-02-27 00:23:41.604943066 +0000 UTC m=+1090.862482630" observedRunningTime="2026-02-27 00:23:42.723554551 +0000 UTC m=+1091.981094095" watchObservedRunningTime="2026-02-27 00:23:42.758793152 +0000 UTC m=+1092.016332696" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.761284 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-67d996989d-jnhdb" podStartSLOduration=3.762131421 podStartE2EDuration="15.761264307s" podCreationTimestamp="2026-02-27 00:23:27 +0000 UTC" firstStartedPulling="2026-02-27 00:23:29.657397819 +0000 UTC m=+1078.914937373" lastFinishedPulling="2026-02-27 00:23:41.656530705 +0000 UTC m=+1090.914070259" observedRunningTime="2026-02-27 00:23:42.758238318 +0000 UTC m=+1092.015777872" watchObservedRunningTime="2026-02-27 00:23:42.761264307 +0000 UTC m=+1092.018803861" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.789037 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-nfzvw" podStartSLOduration=3.271942861 podStartE2EDuration="15.789021852s" podCreationTimestamp="2026-02-27 00:23:27 +0000 UTC" firstStartedPulling="2026-02-27 00:23:29.139443103 +0000 UTC m=+1078.396982647" lastFinishedPulling="2026-02-27 00:23:41.656522084 +0000 UTC m=+1090.914061638" observedRunningTime="2026-02-27 00:23:42.783919889 +0000 UTC m=+1092.041459463" watchObservedRunningTime="2026-02-27 00:23:42.789021852 +0000 UTC m=+1092.046561416" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.807258 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rfwpm" podStartSLOduration=3.290411712 podStartE2EDuration="15.807240888s" podCreationTimestamp="2026-02-27 00:23:27 +0000 UTC" 
firstStartedPulling="2026-02-27 00:23:29.140317785 +0000 UTC m=+1078.397857339" lastFinishedPulling="2026-02-27 00:23:41.657146961 +0000 UTC m=+1090.914686515" observedRunningTime="2026-02-27 00:23:42.804359503 +0000 UTC m=+1092.061899057" watchObservedRunningTime="2026-02-27 00:23:42.807240888 +0000 UTC m=+1092.064780442" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.835530 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-fb2wf" podStartSLOduration=3.376320128 podStartE2EDuration="15.835516057s" podCreationTimestamp="2026-02-27 00:23:27 +0000 UTC" firstStartedPulling="2026-02-27 00:23:29.138911479 +0000 UTC m=+1078.396451033" lastFinishedPulling="2026-02-27 00:23:41.598107408 +0000 UTC m=+1090.855646962" observedRunningTime="2026-02-27 00:23:42.832447147 +0000 UTC m=+1092.089986701" watchObservedRunningTime="2026-02-27 00:23:42.835516057 +0000 UTC m=+1092.093055611" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.852643 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-5mgl8" podStartSLOduration=2.834999656 podStartE2EDuration="14.852602854s" podCreationTimestamp="2026-02-27 00:23:28 +0000 UTC" firstStartedPulling="2026-02-27 00:23:29.638128556 +0000 UTC m=+1078.895668110" lastFinishedPulling="2026-02-27 00:23:41.655731744 +0000 UTC m=+1090.913271308" observedRunningTime="2026-02-27 00:23:42.849482763 +0000 UTC m=+1092.107022317" watchObservedRunningTime="2026-02-27 00:23:42.852602854 +0000 UTC m=+1092.110142418" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.865978 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-2pgf6" podStartSLOduration=3.637192606 podStartE2EDuration="15.865961363s" podCreationTimestamp="2026-02-27 00:23:27 +0000 UTC" 
firstStartedPulling="2026-02-27 00:23:29.444329831 +0000 UTC m=+1078.701869385" lastFinishedPulling="2026-02-27 00:23:41.673098588 +0000 UTC m=+1090.930638142" observedRunningTime="2026-02-27 00:23:42.864779142 +0000 UTC m=+1092.122318696" watchObservedRunningTime="2026-02-27 00:23:42.865961363 +0000 UTC m=+1092.123500917" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.886376 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-v5hwb" podStartSLOduration=3.720034542 podStartE2EDuration="15.886355676s" podCreationTimestamp="2026-02-27 00:23:27 +0000 UTC" firstStartedPulling="2026-02-27 00:23:29.442676758 +0000 UTC m=+1078.700216312" lastFinishedPulling="2026-02-27 00:23:41.608997892 +0000 UTC m=+1090.866537446" observedRunningTime="2026-02-27 00:23:42.884440006 +0000 UTC m=+1092.141979570" watchObservedRunningTime="2026-02-27 00:23:42.886355676 +0000 UTC m=+1092.143895240" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.898939 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.899008 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.919840 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-rn44b" podStartSLOduration=3.640892103 
podStartE2EDuration="15.919822591s" podCreationTimestamp="2026-02-27 00:23:27 +0000 UTC" firstStartedPulling="2026-02-27 00:23:29.325818683 +0000 UTC m=+1078.583358237" lastFinishedPulling="2026-02-27 00:23:41.604749171 +0000 UTC m=+1090.862288725" observedRunningTime="2026-02-27 00:23:42.914013039 +0000 UTC m=+1092.171552583" watchObservedRunningTime="2026-02-27 00:23:42.919822591 +0000 UTC m=+1092.177362145" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.939930 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-szs2w" podStartSLOduration=3.7783667359999997 podStartE2EDuration="15.939914826s" podCreationTimestamp="2026-02-27 00:23:27 +0000 UTC" firstStartedPulling="2026-02-27 00:23:29.443246293 +0000 UTC m=+1078.700785837" lastFinishedPulling="2026-02-27 00:23:41.604794373 +0000 UTC m=+1090.862333927" observedRunningTime="2026-02-27 00:23:42.937593885 +0000 UTC m=+1092.195133439" watchObservedRunningTime="2026-02-27 00:23:42.939914826 +0000 UTC m=+1092.197454380" Feb 27 00:23:42 crc kubenswrapper[4781]: I0227 00:23:42.958474 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-trb7t" podStartSLOduration=3.678044174 podStartE2EDuration="15.958455921s" podCreationTimestamp="2026-02-27 00:23:27 +0000 UTC" firstStartedPulling="2026-02-27 00:23:29.442708508 +0000 UTC m=+1078.700248062" lastFinishedPulling="2026-02-27 00:23:41.723120255 +0000 UTC m=+1090.980659809" observedRunningTime="2026-02-27 00:23:42.955530834 +0000 UTC m=+1092.213070378" watchObservedRunningTime="2026-02-27 00:23:42.958455921 +0000 UTC m=+1092.215995465" Feb 27 00:23:43 crc kubenswrapper[4781]: I0227 00:23:43.754198 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/771a50fd-33f6-47ba-ac4a-46da5446cdd8-cert\") pod 
\"infra-operator-controller-manager-79d975b745-vhmbb\" (UID: \"771a50fd-33f6-47ba-ac4a-46da5446cdd8\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" Feb 27 00:23:43 crc kubenswrapper[4781]: I0227 00:23:43.760268 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/771a50fd-33f6-47ba-ac4a-46da5446cdd8-cert\") pod \"infra-operator-controller-manager-79d975b745-vhmbb\" (UID: \"771a50fd-33f6-47ba-ac4a-46da5446cdd8\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" Feb 27 00:23:43 crc kubenswrapper[4781]: I0227 00:23:43.771412 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-xzphw" Feb 27 00:23:43 crc kubenswrapper[4781]: I0227 00:23:43.780384 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" Feb 27 00:23:44 crc kubenswrapper[4781]: I0227 00:23:44.272280 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83466be2-d230-4516-b594-ee56aae3c510-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4\" (UID: \"83466be2-d230-4516-b594-ee56aae3c510\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" Feb 27 00:23:44 crc kubenswrapper[4781]: I0227 00:23:44.282428 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83466be2-d230-4516-b594-ee56aae3c510-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4\" (UID: \"83466be2-d230-4516-b594-ee56aae3c510\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" Feb 27 00:23:44 crc kubenswrapper[4781]: I0227 00:23:44.290860 4781 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb"] Feb 27 00:23:44 crc kubenswrapper[4781]: I0227 00:23:44.293665 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-blcnh" Feb 27 00:23:44 crc kubenswrapper[4781]: I0227 00:23:44.302348 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" Feb 27 00:23:44 crc kubenswrapper[4781]: I0227 00:23:44.679491 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:44 crc kubenswrapper[4781]: I0227 00:23:44.679815 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-metrics-certs\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:44 crc kubenswrapper[4781]: E0227 00:23:44.680043 4781 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 27 00:23:44 crc kubenswrapper[4781]: E0227 00:23:44.680122 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs podName:9fe881c2-cb59-41ce-a23c-f2dcba86d9c3 nodeName:}" failed. No retries permitted until 2026-02-27 00:24:00.680103075 +0000 UTC m=+1109.937642629 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs") pod "openstack-operator-controller-manager-7677fd857d-kxknf" (UID: "9fe881c2-cb59-41ce-a23c-f2dcba86d9c3") : secret "webhook-server-cert" not found Feb 27 00:23:44 crc kubenswrapper[4781]: I0227 00:23:44.696813 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-metrics-certs\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:23:45 crc kubenswrapper[4781]: W0227 00:23:45.662846 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod771a50fd_33f6_47ba_ac4a_46da5446cdd8.slice/crio-579e138aef64f73a46ab6e02bc9c9a2614b72c787b7df35236b5d305c9049a8c WatchSource:0}: Error finding container 579e138aef64f73a46ab6e02bc9c9a2614b72c787b7df35236b5d305c9049a8c: Status 404 returned error can't find the container with id 579e138aef64f73a46ab6e02bc9c9a2614b72c787b7df35236b5d305c9049a8c Feb 27 00:23:45 crc kubenswrapper[4781]: I0227 00:23:45.734998 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" event={"ID":"771a50fd-33f6-47ba-ac4a-46da5446cdd8","Type":"ContainerStarted","Data":"579e138aef64f73a46ab6e02bc9c9a2614b72c787b7df35236b5d305c9049a8c"} Feb 27 00:23:48 crc kubenswrapper[4781]: I0227 00:23:48.050029 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rfwpm" Feb 27 00:23:48 crc kubenswrapper[4781]: I0227 00:23:48.066195 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-fb2wf" Feb 27 00:23:48 crc kubenswrapper[4781]: I0227 00:23:48.078211 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-rn44b" Feb 27 00:23:48 crc kubenswrapper[4781]: I0227 00:23:48.126318 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-nfzvw" Feb 27 00:23:48 crc kubenswrapper[4781]: I0227 00:23:48.158820 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fmbwz" Feb 27 00:23:48 crc kubenswrapper[4781]: I0227 00:23:48.217259 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-szs2w" Feb 27 00:23:48 crc kubenswrapper[4781]: I0227 00:23:48.379823 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-2pgf6" Feb 27 00:23:48 crc kubenswrapper[4781]: I0227 00:23:48.414253 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-4gl88" Feb 27 00:23:48 crc kubenswrapper[4781]: I0227 00:23:48.506353 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-67d996989d-jnhdb" Feb 27 00:23:48 crc kubenswrapper[4781]: I0227 00:23:48.545914 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-v5hwb" Feb 27 00:23:48 crc kubenswrapper[4781]: I0227 00:23:48.566238 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/nova-operator-controller-manager-567668f5cf-trb7t" Feb 27 00:23:48 crc kubenswrapper[4781]: I0227 00:23:48.801900 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-bvdd5" Feb 27 00:23:48 crc kubenswrapper[4781]: I0227 00:23:48.896185 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-5mgl8" Feb 27 00:24:00 crc kubenswrapper[4781]: I0227 00:24:00.149549 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535864-cfd4d"] Feb 27 00:24:00 crc kubenswrapper[4781]: I0227 00:24:00.151338 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535864-cfd4d" Feb 27 00:24:00 crc kubenswrapper[4781]: I0227 00:24:00.153792 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:24:00 crc kubenswrapper[4781]: I0227 00:24:00.153805 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:24:00 crc kubenswrapper[4781]: I0227 00:24:00.154493 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:24:00 crc kubenswrapper[4781]: I0227 00:24:00.157605 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535864-cfd4d"] Feb 27 00:24:00 crc kubenswrapper[4781]: I0227 00:24:00.243683 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfqjm\" (UniqueName: \"kubernetes.io/projected/b9402a6e-66bb-4e1e-a33f-7fce411c83b8-kube-api-access-gfqjm\") pod \"auto-csr-approver-29535864-cfd4d\" (UID: \"b9402a6e-66bb-4e1e-a33f-7fce411c83b8\") " 
pod="openshift-infra/auto-csr-approver-29535864-cfd4d" Feb 27 00:24:00 crc kubenswrapper[4781]: I0227 00:24:00.259768 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4"] Feb 27 00:24:00 crc kubenswrapper[4781]: E0227 00:24:00.345653 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a" Feb 27 00:24:00 crc kubenswrapper[4781]: E0227 00:24:00.345854 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fp4z9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-6994f66f48-w5wp5_openstack-operators(f777df4b-1040-4f86-a816-ea778b9e5dc3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 00:24:00 crc kubenswrapper[4781]: I0227 00:24:00.345954 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfqjm\" (UniqueName: \"kubernetes.io/projected/b9402a6e-66bb-4e1e-a33f-7fce411c83b8-kube-api-access-gfqjm\") pod \"auto-csr-approver-29535864-cfd4d\" (UID: \"b9402a6e-66bb-4e1e-a33f-7fce411c83b8\") " pod="openshift-infra/auto-csr-approver-29535864-cfd4d" Feb 27 00:24:00 crc kubenswrapper[4781]: E0227 00:24:00.346946 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: 
context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-w5wp5" podUID="f777df4b-1040-4f86-a816-ea778b9e5dc3" Feb 27 00:24:00 crc kubenswrapper[4781]: I0227 00:24:00.369187 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfqjm\" (UniqueName: \"kubernetes.io/projected/b9402a6e-66bb-4e1e-a33f-7fce411c83b8-kube-api-access-gfqjm\") pod \"auto-csr-approver-29535864-cfd4d\" (UID: \"b9402a6e-66bb-4e1e-a33f-7fce411c83b8\") " pod="openshift-infra/auto-csr-approver-29535864-cfd4d" Feb 27 00:24:00 crc kubenswrapper[4781]: I0227 00:24:00.479249 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535864-cfd4d" Feb 27 00:24:00 crc kubenswrapper[4781]: I0227 00:24:00.754475 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:24:00 crc kubenswrapper[4781]: I0227 00:24:00.761549 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9fe881c2-cb59-41ce-a23c-f2dcba86d9c3-webhook-certs\") pod \"openstack-operator-controller-manager-7677fd857d-kxknf\" (UID: \"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3\") " pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:24:00 crc kubenswrapper[4781]: I0227 00:24:00.894111 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-j74ff" Feb 27 00:24:00 crc kubenswrapper[4781]: I0227 00:24:00.902315 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:24:01 crc kubenswrapper[4781]: E0227 00:24:01.616424 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd" Feb 27 00:24:01 crc kubenswrapper[4781]: E0227 00:24:01.616754 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4rvr4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 
8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-8497b45c89-rn2vt_openstack-operators(9d03c922-57fc-4f7d-9e6a-b2b6f3b535d1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 00:24:01 crc kubenswrapper[4781]: E0227 00:24:01.618076 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-rn2vt" podUID="9d03c922-57fc-4f7d-9e6a-b2b6f3b535d1" Feb 27 00:24:02 crc kubenswrapper[4781]: E0227 00:24:02.011242 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Feb 27 00:24:02 crc kubenswrapper[4781]: E0227 00:24:02.011441 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fhd9z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-7gr7g_openstack-operators(6d15395c-5ed9-43c8-b7f6-ac16e6e32e70): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 00:24:02 crc kubenswrapper[4781]: E0227 00:24:02.012618 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gr7g" podUID="6d15395c-5ed9-43c8-b7f6-ac16e6e32e70" Feb 27 00:24:02 crc kubenswrapper[4781]: E0227 00:24:02.468285 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97" Feb 27 00:24:02 crc kubenswrapper[4781]: E0227 00:24:02.468741 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sfl55,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-bccc79885-gs62l_openstack-operators(d31610db-32c1-4c99-9001-ab4504649a75): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 00:24:02 crc kubenswrapper[4781]: E0227 00:24:02.470969 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/watcher-operator-controller-manager-bccc79885-gs62l" podUID="d31610db-32c1-4c99-9001-ab4504649a75" Feb 27 00:24:02 crc kubenswrapper[4781]: E0227 00:24:02.853940 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/infra-operator@sha256:aef5ea3dc1d4f5b63416ee1cc12d0360a64229bb3fb954be3dd85eec8f4ae62a" Feb 27 00:24:02 crc kubenswrapper[4781]: E0227 00:24:02.854127 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:aef5ea3dc1d4f5b63416ee1cc12d0360a64229bb3fb954be3dd85eec8f4ae62a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vpr5p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-79d975b745-vhmbb_openstack-operators(771a50fd-33f6-47ba-ac4a-46da5446cdd8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 00:24:02 crc kubenswrapper[4781]: E0227 00:24:02.855269 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" podUID="771a50fd-33f6-47ba-ac4a-46da5446cdd8" Feb 27 00:24:02 crc kubenswrapper[4781]: I0227 00:24:02.880374 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" 
event={"ID":"83466be2-d230-4516-b594-ee56aae3c510","Type":"ContainerStarted","Data":"c7c04ffaf4b66c6d34f22072c332cd7fa571642346c94a35e0c138e7e21df50e"} Feb 27 00:24:02 crc kubenswrapper[4781]: E0227 00:24:02.881906 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:aef5ea3dc1d4f5b63416ee1cc12d0360a64229bb3fb954be3dd85eec8f4ae62a\\\"\"" pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" podUID="771a50fd-33f6-47ba-ac4a-46da5446cdd8" Feb 27 00:24:04 crc kubenswrapper[4781]: E0227 00:24:04.989235 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:38e6a5bd24ab1684f22a64186fe99a7cdc7897eb7feb715ec1704eea7596dd98" Feb 27 00:24:04 crc kubenswrapper[4781]: E0227 00:24:04.989455 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:38e6a5bd24ab1684f22a64186fe99a7cdc7897eb7feb715ec1704eea7596dd98,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p9s8r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5dc6794d5b-dc7k2_openstack-operators(cf1fe81a-282d-4e51-b8d9-d6569a640985): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 00:24:04 crc kubenswrapper[4781]: E0227 00:24:04.990611 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-dc7k2" podUID="cf1fe81a-282d-4e51-b8d9-d6569a640985" Feb 27 00:24:05 crc kubenswrapper[4781]: I0227 00:24:05.524253 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535864-cfd4d"] Feb 27 00:24:05 crc kubenswrapper[4781]: W0227 00:24:05.533875 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9402a6e_66bb_4e1e_a33f_7fce411c83b8.slice/crio-0f2effad7d2f9b3a8b364986d3b2496800347b3242d554451b7d4c789942439f WatchSource:0}: Error finding container 0f2effad7d2f9b3a8b364986d3b2496800347b3242d554451b7d4c789942439f: Status 404 returned error can't find the container with id 0f2effad7d2f9b3a8b364986d3b2496800347b3242d554451b7d4c789942439f Feb 27 00:24:05 crc kubenswrapper[4781]: I0227 00:24:05.547404 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf"] Feb 27 00:24:05 crc kubenswrapper[4781]: W0227 00:24:05.555477 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fe881c2_cb59_41ce_a23c_f2dcba86d9c3.slice/crio-efb37c89ced21736b4124b5007f93df1cbca5667cfa95469450f490b000d3e82 WatchSource:0}: Error finding container efb37c89ced21736b4124b5007f93df1cbca5667cfa95469450f490b000d3e82: Status 404 returned error can't find the container with id efb37c89ced21736b4124b5007f93df1cbca5667cfa95469450f490b000d3e82 Feb 27 00:24:05 crc kubenswrapper[4781]: I0227 00:24:05.904616 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-9d678b567-gttml" event={"ID":"11361a5e-18c5-448a-8b07-8f5e3245f607","Type":"ContainerStarted","Data":"01161ee19b4dff0a1f69598e67b45b7ac1a1e034e9d63077384c03bbecd1a305"} Feb 27 00:24:05 crc kubenswrapper[4781]: I0227 00:24:05.905129 4781 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-9d678b567-gttml" Feb 27 00:24:05 crc kubenswrapper[4781]: I0227 00:24:05.905402 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535864-cfd4d" event={"ID":"b9402a6e-66bb-4e1e-a33f-7fce411c83b8","Type":"ContainerStarted","Data":"0f2effad7d2f9b3a8b364986d3b2496800347b3242d554451b7d4c789942439f"} Feb 27 00:24:05 crc kubenswrapper[4781]: I0227 00:24:05.906478 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tb298" event={"ID":"7d5e1e13-5ce4-48ba-a8c9-3db924e63840","Type":"ContainerStarted","Data":"c9bc986b5b89efcaaf41630bd3949d7ebda64929ff329299873b3f49e8e68663"} Feb 27 00:24:05 crc kubenswrapper[4781]: I0227 00:24:05.906668 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tb298" Feb 27 00:24:05 crc kubenswrapper[4781]: I0227 00:24:05.907937 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" event={"ID":"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3","Type":"ContainerStarted","Data":"bbf00a8c2adc7ae793c27c40c2d41247cf3282ba1fb4da8744e6379a6fd1a1b2"} Feb 27 00:24:05 crc kubenswrapper[4781]: I0227 00:24:05.907973 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" event={"ID":"9fe881c2-cb59-41ce-a23c-f2dcba86d9c3","Type":"ContainerStarted","Data":"efb37c89ced21736b4124b5007f93df1cbca5667cfa95469450f490b000d3e82"} Feb 27 00:24:05 crc kubenswrapper[4781]: I0227 00:24:05.908488 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:24:05 crc kubenswrapper[4781]: I0227 
00:24:05.921295 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-9d678b567-gttml" podStartSLOduration=4.857749066 podStartE2EDuration="37.921280948s" podCreationTimestamp="2026-02-27 00:23:28 +0000 UTC" firstStartedPulling="2026-02-27 00:23:29.780468296 +0000 UTC m=+1079.038007850" lastFinishedPulling="2026-02-27 00:24:02.844000178 +0000 UTC m=+1112.101539732" observedRunningTime="2026-02-27 00:24:05.91981077 +0000 UTC m=+1115.177350354" watchObservedRunningTime="2026-02-27 00:24:05.921280948 +0000 UTC m=+1115.178820502" Feb 27 00:24:05 crc kubenswrapper[4781]: I0227 00:24:05.953922 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" podStartSLOduration=37.953906434 podStartE2EDuration="37.953906434s" podCreationTimestamp="2026-02-27 00:23:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:24:05.953700889 +0000 UTC m=+1115.211240463" watchObservedRunningTime="2026-02-27 00:24:05.953906434 +0000 UTC m=+1115.211445998" Feb 27 00:24:05 crc kubenswrapper[4781]: I0227 00:24:05.971162 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tb298" podStartSLOduration=5.170375863 podStartE2EDuration="37.971145127s" podCreationTimestamp="2026-02-27 00:23:28 +0000 UTC" firstStartedPulling="2026-02-27 00:23:29.661658031 +0000 UTC m=+1078.919197585" lastFinishedPulling="2026-02-27 00:24:02.462427295 +0000 UTC m=+1111.719966849" observedRunningTime="2026-02-27 00:24:05.969177105 +0000 UTC m=+1115.226716669" watchObservedRunningTime="2026-02-27 00:24:05.971145127 +0000 UTC m=+1115.228684681" Feb 27 00:24:06 crc kubenswrapper[4781]: I0227 00:24:06.915578 4781 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" event={"ID":"83466be2-d230-4516-b594-ee56aae3c510","Type":"ContainerStarted","Data":"c09933a142395c89175c8f4b09ce1001f94adf8018377eefdcc29796a34dffef"} Feb 27 00:24:06 crc kubenswrapper[4781]: I0227 00:24:06.916327 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" Feb 27 00:24:06 crc kubenswrapper[4781]: I0227 00:24:06.954019 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" podStartSLOduration=34.874077018 podStartE2EDuration="38.953996998s" podCreationTimestamp="2026-02-27 00:23:28 +0000 UTC" firstStartedPulling="2026-02-27 00:24:02.466944344 +0000 UTC m=+1111.724483898" lastFinishedPulling="2026-02-27 00:24:06.546864324 +0000 UTC m=+1115.804403878" observedRunningTime="2026-02-27 00:24:06.945035393 +0000 UTC m=+1116.202574947" watchObservedRunningTime="2026-02-27 00:24:06.953996998 +0000 UTC m=+1116.211536552" Feb 27 00:24:07 crc kubenswrapper[4781]: I0227 00:24:07.923086 4781 generic.go:334] "Generic (PLEG): container finished" podID="b9402a6e-66bb-4e1e-a33f-7fce411c83b8" containerID="e7c34540c9407121a9ee96d4e0537e4a13bd65448411272b9cedd072273699e8" exitCode=0 Feb 27 00:24:07 crc kubenswrapper[4781]: I0227 00:24:07.923140 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535864-cfd4d" event={"ID":"b9402a6e-66bb-4e1e-a33f-7fce411c83b8","Type":"ContainerDied","Data":"e7c34540c9407121a9ee96d4e0537e4a13bd65448411272b9cedd072273699e8"} Feb 27 00:24:09 crc kubenswrapper[4781]: I0227 00:24:09.265936 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535864-cfd4d" Feb 27 00:24:09 crc kubenswrapper[4781]: I0227 00:24:09.372619 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfqjm\" (UniqueName: \"kubernetes.io/projected/b9402a6e-66bb-4e1e-a33f-7fce411c83b8-kube-api-access-gfqjm\") pod \"b9402a6e-66bb-4e1e-a33f-7fce411c83b8\" (UID: \"b9402a6e-66bb-4e1e-a33f-7fce411c83b8\") " Feb 27 00:24:09 crc kubenswrapper[4781]: I0227 00:24:09.377929 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9402a6e-66bb-4e1e-a33f-7fce411c83b8-kube-api-access-gfqjm" (OuterVolumeSpecName: "kube-api-access-gfqjm") pod "b9402a6e-66bb-4e1e-a33f-7fce411c83b8" (UID: "b9402a6e-66bb-4e1e-a33f-7fce411c83b8"). InnerVolumeSpecName "kube-api-access-gfqjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:24:09 crc kubenswrapper[4781]: I0227 00:24:09.476364 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfqjm\" (UniqueName: \"kubernetes.io/projected/b9402a6e-66bb-4e1e-a33f-7fce411c83b8-kube-api-access-gfqjm\") on node \"crc\" DevicePath \"\"" Feb 27 00:24:09 crc kubenswrapper[4781]: I0227 00:24:09.942306 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535864-cfd4d" event={"ID":"b9402a6e-66bb-4e1e-a33f-7fce411c83b8","Type":"ContainerDied","Data":"0f2effad7d2f9b3a8b364986d3b2496800347b3242d554451b7d4c789942439f"} Feb 27 00:24:09 crc kubenswrapper[4781]: I0227 00:24:09.942351 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f2effad7d2f9b3a8b364986d3b2496800347b3242d554451b7d4c789942439f" Feb 27 00:24:09 crc kubenswrapper[4781]: I0227 00:24:09.942759 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535864-cfd4d" Feb 27 00:24:10 crc kubenswrapper[4781]: I0227 00:24:10.330278 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535858-9fs8d"] Feb 27 00:24:10 crc kubenswrapper[4781]: I0227 00:24:10.335614 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535858-9fs8d"] Feb 27 00:24:10 crc kubenswrapper[4781]: I0227 00:24:10.910983 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7677fd857d-kxknf" Feb 27 00:24:11 crc kubenswrapper[4781]: I0227 00:24:11.317257 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bb1e1bd-28ea-42f4-96d5-534db2674e68" path="/var/lib/kubelet/pods/3bb1e1bd-28ea-42f4-96d5-534db2674e68/volumes" Feb 27 00:24:12 crc kubenswrapper[4781]: E0227 00:24:12.310848 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-rn2vt" podUID="9d03c922-57fc-4f7d-9e6a-b2b6f3b535d1" Feb 27 00:24:12 crc kubenswrapper[4781]: E0227 00:24:12.310868 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-w5wp5" podUID="f777df4b-1040-4f86-a816-ea778b9e5dc3" Feb 27 00:24:12 crc kubenswrapper[4781]: I0227 00:24:12.895462 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:24:12 crc kubenswrapper[4781]: I0227 00:24:12.895536 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:24:12 crc kubenswrapper[4781]: I0227 00:24:12.895585 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:24:12 crc kubenswrapper[4781]: I0227 00:24:12.896258 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"58cd249b96a5284dbe453e012e30bb3f9acbc9ed9b891c6e44075d418edc5ad9"} pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 00:24:12 crc kubenswrapper[4781]: I0227 00:24:12.896318 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" containerID="cri-o://58cd249b96a5284dbe453e012e30bb3f9acbc9ed9b891c6e44075d418edc5ad9" gracePeriod=600 Feb 27 00:24:13 crc kubenswrapper[4781]: I0227 00:24:13.981024 4781 generic.go:334] "Generic (PLEG): container finished" podID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerID="58cd249b96a5284dbe453e012e30bb3f9acbc9ed9b891c6e44075d418edc5ad9" exitCode=0 Feb 27 00:24:13 crc kubenswrapper[4781]: I0227 00:24:13.981111 4781 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerDied","Data":"58cd249b96a5284dbe453e012e30bb3f9acbc9ed9b891c6e44075d418edc5ad9"} Feb 27 00:24:13 crc kubenswrapper[4781]: I0227 00:24:13.981641 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"40924ce0e5e04646329cd01d3e3dc65fdaf6b21bdd01704d3fa5ed81c86443f6"} Feb 27 00:24:13 crc kubenswrapper[4781]: I0227 00:24:13.981669 4781 scope.go:117] "RemoveContainer" containerID="4a4838ae34a31bed19fe04c8cb77eb7ca161a34e4d168445bf5a5f93e91a959a" Feb 27 00:24:14 crc kubenswrapper[4781]: E0227 00:24:14.311160 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-gs62l" podUID="d31610db-32c1-4c99-9001-ab4504649a75" Feb 27 00:24:14 crc kubenswrapper[4781]: I0227 00:24:14.315974 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4" Feb 27 00:24:16 crc kubenswrapper[4781]: I0227 00:24:16.019172 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" event={"ID":"771a50fd-33f6-47ba-ac4a-46da5446cdd8","Type":"ContainerStarted","Data":"3055237bd72add8225f237f142360ed0c7c8f63834d929d7742ed234922fc4a2"} Feb 27 00:24:16 crc kubenswrapper[4781]: I0227 00:24:16.020194 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" Feb 27 00:24:16 crc kubenswrapper[4781]: I0227 00:24:16.045018 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" podStartSLOduration=19.004881071 podStartE2EDuration="49.044984101s" podCreationTimestamp="2026-02-27 00:23:27 +0000 UTC" firstStartedPulling="2026-02-27 00:23:45.690328487 +0000 UTC m=+1094.947868041" lastFinishedPulling="2026-02-27 00:24:15.730431507 +0000 UTC m=+1124.987971071" observedRunningTime="2026-02-27 00:24:16.041256763 +0000 UTC m=+1125.298796327" watchObservedRunningTime="2026-02-27 00:24:16.044984101 +0000 UTC m=+1125.302523695" Feb 27 00:24:16 crc kubenswrapper[4781]: E0227 00:24:16.309665 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gr7g" podUID="6d15395c-5ed9-43c8-b7f6-ac16e6e32e70" Feb 27 00:24:17 crc kubenswrapper[4781]: E0227 00:24:17.310926 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:38e6a5bd24ab1684f22a64186fe99a7cdc7897eb7feb715ec1704eea7596dd98\\\"\"" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-dc7k2" podUID="cf1fe81a-282d-4e51-b8d9-d6569a640985" Feb 27 00:24:18 crc kubenswrapper[4781]: I0227 00:24:18.666533 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tb298" Feb 27 00:24:19 crc kubenswrapper[4781]: I0227 00:24:19.024422 4781 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-9d678b567-gttml" Feb 27 00:24:23 crc kubenswrapper[4781]: I0227 00:24:23.789720 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-vhmbb" Feb 27 00:24:27 crc kubenswrapper[4781]: I0227 00:24:27.128418 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-rn2vt" event={"ID":"9d03c922-57fc-4f7d-9e6a-b2b6f3b535d1","Type":"ContainerStarted","Data":"0fed33bc277b70314e993e7d612c8ba4d799cf63ea7db6799188b04fcbb1701e"} Feb 27 00:24:27 crc kubenswrapper[4781]: I0227 00:24:27.129220 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-rn2vt" Feb 27 00:24:27 crc kubenswrapper[4781]: I0227 00:24:27.129925 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-w5wp5" event={"ID":"f777df4b-1040-4f86-a816-ea778b9e5dc3","Type":"ContainerStarted","Data":"aa9b6f371f5ef215ee04c8b30092785402f0b962e9163bc2f7a62ef89909297b"} Feb 27 00:24:27 crc kubenswrapper[4781]: I0227 00:24:27.130210 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-w5wp5" Feb 27 00:24:27 crc kubenswrapper[4781]: I0227 00:24:27.143072 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-rn2vt" podStartSLOduration=2.08160666 podStartE2EDuration="59.143054912s" podCreationTimestamp="2026-02-27 00:23:28 +0000 UTC" firstStartedPulling="2026-02-27 00:23:29.672453463 +0000 UTC m=+1078.929993017" lastFinishedPulling="2026-02-27 00:24:26.733901695 +0000 UTC m=+1135.991441269" observedRunningTime="2026-02-27 
00:24:27.14107643 +0000 UTC m=+1136.398615994" watchObservedRunningTime="2026-02-27 00:24:27.143054912 +0000 UTC m=+1136.400594466" Feb 27 00:24:27 crc kubenswrapper[4781]: I0227 00:24:27.156043 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-w5wp5" podStartSLOduration=3.04253977 podStartE2EDuration="1m0.156026122s" podCreationTimestamp="2026-02-27 00:23:27 +0000 UTC" firstStartedPulling="2026-02-27 00:23:29.701551613 +0000 UTC m=+1078.959091177" lastFinishedPulling="2026-02-27 00:24:26.815037955 +0000 UTC m=+1136.072577529" observedRunningTime="2026-02-27 00:24:27.153336922 +0000 UTC m=+1136.410876496" watchObservedRunningTime="2026-02-27 00:24:27.156026122 +0000 UTC m=+1136.413565676" Feb 27 00:24:30 crc kubenswrapper[4781]: I0227 00:24:30.154896 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-gs62l" event={"ID":"d31610db-32c1-4c99-9001-ab4504649a75","Type":"ContainerStarted","Data":"82eaab020c799642a67b8d3e650d83ca83cf1f9b9f741b6c7d332936321fa8dd"} Feb 27 00:24:30 crc kubenswrapper[4781]: I0227 00:24:30.155659 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-gs62l" Feb 27 00:24:30 crc kubenswrapper[4781]: I0227 00:24:30.173689 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-gs62l" podStartSLOduration=2.108756161 podStartE2EDuration="1m2.173668918s" podCreationTimestamp="2026-02-27 00:23:28 +0000 UTC" firstStartedPulling="2026-02-27 00:23:29.776885842 +0000 UTC m=+1079.034425396" lastFinishedPulling="2026-02-27 00:24:29.841798559 +0000 UTC m=+1139.099338153" observedRunningTime="2026-02-27 00:24:30.1729923 +0000 UTC m=+1139.430531874" watchObservedRunningTime="2026-02-27 00:24:30.173668918 +0000 UTC 
m=+1139.431208482" Feb 27 00:24:31 crc kubenswrapper[4781]: I0227 00:24:31.162099 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gr7g" event={"ID":"6d15395c-5ed9-43c8-b7f6-ac16e6e32e70","Type":"ContainerStarted","Data":"e0c1fd5fb38c3aef7c1dd3b851822a2247022a8fd97f150917ae99dcd6622583"} Feb 27 00:24:31 crc kubenswrapper[4781]: I0227 00:24:31.179727 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7gr7g" podStartSLOduration=2.266874279 podStartE2EDuration="1m3.179712437s" podCreationTimestamp="2026-02-27 00:23:28 +0000 UTC" firstStartedPulling="2026-02-27 00:23:29.788088015 +0000 UTC m=+1079.045627559" lastFinishedPulling="2026-02-27 00:24:30.700926153 +0000 UTC m=+1139.958465717" observedRunningTime="2026-02-27 00:24:31.17714217 +0000 UTC m=+1140.434681724" watchObservedRunningTime="2026-02-27 00:24:31.179712437 +0000 UTC m=+1140.437251991" Feb 27 00:24:33 crc kubenswrapper[4781]: I0227 00:24:33.180017 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-dc7k2" event={"ID":"cf1fe81a-282d-4e51-b8d9-d6569a640985","Type":"ContainerStarted","Data":"18882c8130987e0cf6ead5d8d1c23156356def627df7a15c4f5e2383dc0f395c"} Feb 27 00:24:33 crc kubenswrapper[4781]: I0227 00:24:33.180716 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-dc7k2" Feb 27 00:24:33 crc kubenswrapper[4781]: I0227 00:24:33.197833 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-dc7k2" podStartSLOduration=2.132804801 podStartE2EDuration="1m5.197813755s" podCreationTimestamp="2026-02-27 00:23:28 +0000 UTC" firstStartedPulling="2026-02-27 00:23:29.782233832 +0000 UTC m=+1079.039773386" 
lastFinishedPulling="2026-02-27 00:24:32.847242786 +0000 UTC m=+1142.104782340" observedRunningTime="2026-02-27 00:24:33.194236511 +0000 UTC m=+1142.451776095" watchObservedRunningTime="2026-02-27 00:24:33.197813755 +0000 UTC m=+1142.455353319" Feb 27 00:24:38 crc kubenswrapper[4781]: I0227 00:24:38.526594 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-w5wp5" Feb 27 00:24:38 crc kubenswrapper[4781]: I0227 00:24:38.912558 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-rn2vt" Feb 27 00:24:39 crc kubenswrapper[4781]: I0227 00:24:39.045781 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-dc7k2" Feb 27 00:24:39 crc kubenswrapper[4781]: I0227 00:24:39.079195 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-gs62l" Feb 27 00:24:54 crc kubenswrapper[4781]: I0227 00:24:54.990142 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vdbg9"] Feb 27 00:24:54 crc kubenswrapper[4781]: E0227 00:24:54.990925 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9402a6e-66bb-4e1e-a33f-7fce411c83b8" containerName="oc" Feb 27 00:24:54 crc kubenswrapper[4781]: I0227 00:24:54.990938 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9402a6e-66bb-4e1e-a33f-7fce411c83b8" containerName="oc" Feb 27 00:24:54 crc kubenswrapper[4781]: I0227 00:24:54.991111 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9402a6e-66bb-4e1e-a33f-7fce411c83b8" containerName="oc" Feb 27 00:24:54 crc kubenswrapper[4781]: I0227 00:24:54.992518 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-vdbg9" Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.011195 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.011558 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.011706 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.015873 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-qg6tr" Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.026063 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vdbg9"] Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.085668 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fltx\" (UniqueName: \"kubernetes.io/projected/c73aa093-0d39-41f9-a0bd-35e621c4cf8c-kube-api-access-8fltx\") pod \"dnsmasq-dns-675f4bcbfc-vdbg9\" (UID: \"c73aa093-0d39-41f9-a0bd-35e621c4cf8c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vdbg9" Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.085721 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c73aa093-0d39-41f9-a0bd-35e621c4cf8c-config\") pod \"dnsmasq-dns-675f4bcbfc-vdbg9\" (UID: \"c73aa093-0d39-41f9-a0bd-35e621c4cf8c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vdbg9" Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.133489 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6drvh"] Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.134598 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-6drvh" Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.137358 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.186298 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c73aa093-0d39-41f9-a0bd-35e621c4cf8c-config\") pod \"dnsmasq-dns-675f4bcbfc-vdbg9\" (UID: \"c73aa093-0d39-41f9-a0bd-35e621c4cf8c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vdbg9" Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.186348 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5411c548-900f-4d1e-816d-8687268b6ebc-config\") pod \"dnsmasq-dns-78dd6ddcc-6drvh\" (UID: \"5411c548-900f-4d1e-816d-8687268b6ebc\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6drvh" Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.186403 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5411c548-900f-4d1e-816d-8687268b6ebc-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-6drvh\" (UID: \"5411c548-900f-4d1e-816d-8687268b6ebc\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6drvh" Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.186437 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z4t2\" (UniqueName: \"kubernetes.io/projected/5411c548-900f-4d1e-816d-8687268b6ebc-kube-api-access-8z4t2\") pod \"dnsmasq-dns-78dd6ddcc-6drvh\" (UID: \"5411c548-900f-4d1e-816d-8687268b6ebc\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6drvh" Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.186469 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fltx\" (UniqueName: 
\"kubernetes.io/projected/c73aa093-0d39-41f9-a0bd-35e621c4cf8c-kube-api-access-8fltx\") pod \"dnsmasq-dns-675f4bcbfc-vdbg9\" (UID: \"c73aa093-0d39-41f9-a0bd-35e621c4cf8c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vdbg9" Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.187243 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c73aa093-0d39-41f9-a0bd-35e621c4cf8c-config\") pod \"dnsmasq-dns-675f4bcbfc-vdbg9\" (UID: \"c73aa093-0d39-41f9-a0bd-35e621c4cf8c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vdbg9" Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.197587 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6drvh"] Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.215772 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fltx\" (UniqueName: \"kubernetes.io/projected/c73aa093-0d39-41f9-a0bd-35e621c4cf8c-kube-api-access-8fltx\") pod \"dnsmasq-dns-675f4bcbfc-vdbg9\" (UID: \"c73aa093-0d39-41f9-a0bd-35e621c4cf8c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vdbg9" Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.287473 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5411c548-900f-4d1e-816d-8687268b6ebc-config\") pod \"dnsmasq-dns-78dd6ddcc-6drvh\" (UID: \"5411c548-900f-4d1e-816d-8687268b6ebc\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6drvh" Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.287869 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5411c548-900f-4d1e-816d-8687268b6ebc-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-6drvh\" (UID: \"5411c548-900f-4d1e-816d-8687268b6ebc\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6drvh" Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.287913 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8z4t2\" (UniqueName: \"kubernetes.io/projected/5411c548-900f-4d1e-816d-8687268b6ebc-kube-api-access-8z4t2\") pod \"dnsmasq-dns-78dd6ddcc-6drvh\" (UID: \"5411c548-900f-4d1e-816d-8687268b6ebc\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6drvh" Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.288346 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5411c548-900f-4d1e-816d-8687268b6ebc-config\") pod \"dnsmasq-dns-78dd6ddcc-6drvh\" (UID: \"5411c548-900f-4d1e-816d-8687268b6ebc\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6drvh" Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.288606 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5411c548-900f-4d1e-816d-8687268b6ebc-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-6drvh\" (UID: \"5411c548-900f-4d1e-816d-8687268b6ebc\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6drvh" Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.303145 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z4t2\" (UniqueName: \"kubernetes.io/projected/5411c548-900f-4d1e-816d-8687268b6ebc-kube-api-access-8z4t2\") pod \"dnsmasq-dns-78dd6ddcc-6drvh\" (UID: \"5411c548-900f-4d1e-816d-8687268b6ebc\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6drvh" Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.311195 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-vdbg9" Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.452176 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-6drvh" Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.783225 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vdbg9"] Feb 27 00:24:55 crc kubenswrapper[4781]: W0227 00:24:55.785872 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc73aa093_0d39_41f9_a0bd_35e621c4cf8c.slice/crio-060eb712a24d9ca196230b121a82b836df7e85841645aaea05dcf490cac6e106 WatchSource:0}: Error finding container 060eb712a24d9ca196230b121a82b836df7e85841645aaea05dcf490cac6e106: Status 404 returned error can't find the container with id 060eb712a24d9ca196230b121a82b836df7e85841645aaea05dcf490cac6e106 Feb 27 00:24:55 crc kubenswrapper[4781]: I0227 00:24:55.890895 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6drvh"] Feb 27 00:24:55 crc kubenswrapper[4781]: W0227 00:24:55.892295 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5411c548_900f_4d1e_816d_8687268b6ebc.slice/crio-705ae1ac3f4cc27e243e281ed7ae1b8b6a29f990c8825db10f1d6ca4d40585cf WatchSource:0}: Error finding container 705ae1ac3f4cc27e243e281ed7ae1b8b6a29f990c8825db10f1d6ca4d40585cf: Status 404 returned error can't find the container with id 705ae1ac3f4cc27e243e281ed7ae1b8b6a29f990c8825db10f1d6ca4d40585cf Feb 27 00:24:56 crc kubenswrapper[4781]: I0227 00:24:56.385210 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-vdbg9" event={"ID":"c73aa093-0d39-41f9-a0bd-35e621c4cf8c","Type":"ContainerStarted","Data":"060eb712a24d9ca196230b121a82b836df7e85841645aaea05dcf490cac6e106"} Feb 27 00:24:56 crc kubenswrapper[4781]: I0227 00:24:56.395973 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-6drvh" 
event={"ID":"5411c548-900f-4d1e-816d-8687268b6ebc","Type":"ContainerStarted","Data":"705ae1ac3f4cc27e243e281ed7ae1b8b6a29f990c8825db10f1d6ca4d40585cf"} Feb 27 00:24:57 crc kubenswrapper[4781]: I0227 00:24:57.808240 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vdbg9"] Feb 27 00:24:57 crc kubenswrapper[4781]: I0227 00:24:57.841998 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-86z46"] Feb 27 00:24:57 crc kubenswrapper[4781]: I0227 00:24:57.843179 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-86z46" Feb 27 00:24:57 crc kubenswrapper[4781]: I0227 00:24:57.858436 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-86z46"] Feb 27 00:24:57 crc kubenswrapper[4781]: I0227 00:24:57.940343 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12142f3c-5849-4af1-8c9e-c92304d3c375-dns-svc\") pod \"dnsmasq-dns-666b6646f7-86z46\" (UID: \"12142f3c-5849-4af1-8c9e-c92304d3c375\") " pod="openstack/dnsmasq-dns-666b6646f7-86z46" Feb 27 00:24:57 crc kubenswrapper[4781]: I0227 00:24:57.940406 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12142f3c-5849-4af1-8c9e-c92304d3c375-config\") pod \"dnsmasq-dns-666b6646f7-86z46\" (UID: \"12142f3c-5849-4af1-8c9e-c92304d3c375\") " pod="openstack/dnsmasq-dns-666b6646f7-86z46" Feb 27 00:24:57 crc kubenswrapper[4781]: I0227 00:24:57.940436 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gk5b\" (UniqueName: \"kubernetes.io/projected/12142f3c-5849-4af1-8c9e-c92304d3c375-kube-api-access-8gk5b\") pod \"dnsmasq-dns-666b6646f7-86z46\" (UID: \"12142f3c-5849-4af1-8c9e-c92304d3c375\") " 
pod="openstack/dnsmasq-dns-666b6646f7-86z46" Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.042344 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12142f3c-5849-4af1-8c9e-c92304d3c375-dns-svc\") pod \"dnsmasq-dns-666b6646f7-86z46\" (UID: \"12142f3c-5849-4af1-8c9e-c92304d3c375\") " pod="openstack/dnsmasq-dns-666b6646f7-86z46" Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.042398 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12142f3c-5849-4af1-8c9e-c92304d3c375-config\") pod \"dnsmasq-dns-666b6646f7-86z46\" (UID: \"12142f3c-5849-4af1-8c9e-c92304d3c375\") " pod="openstack/dnsmasq-dns-666b6646f7-86z46" Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.042423 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gk5b\" (UniqueName: \"kubernetes.io/projected/12142f3c-5849-4af1-8c9e-c92304d3c375-kube-api-access-8gk5b\") pod \"dnsmasq-dns-666b6646f7-86z46\" (UID: \"12142f3c-5849-4af1-8c9e-c92304d3c375\") " pod="openstack/dnsmasq-dns-666b6646f7-86z46" Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.043428 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12142f3c-5849-4af1-8c9e-c92304d3c375-dns-svc\") pod \"dnsmasq-dns-666b6646f7-86z46\" (UID: \"12142f3c-5849-4af1-8c9e-c92304d3c375\") " pod="openstack/dnsmasq-dns-666b6646f7-86z46" Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.043951 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12142f3c-5849-4af1-8c9e-c92304d3c375-config\") pod \"dnsmasq-dns-666b6646f7-86z46\" (UID: \"12142f3c-5849-4af1-8c9e-c92304d3c375\") " pod="openstack/dnsmasq-dns-666b6646f7-86z46" Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.070394 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gk5b\" (UniqueName: \"kubernetes.io/projected/12142f3c-5849-4af1-8c9e-c92304d3c375-kube-api-access-8gk5b\") pod \"dnsmasq-dns-666b6646f7-86z46\" (UID: \"12142f3c-5849-4af1-8c9e-c92304d3c375\") " pod="openstack/dnsmasq-dns-666b6646f7-86z46" Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.169973 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-86z46" Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.210786 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6drvh"] Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.271377 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-f9h55"] Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.273474 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.280999 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-f9h55"] Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.354770 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f2c76ec-cfab-4f18-b624-722021700885-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-f9h55\" (UID: \"0f2c76ec-cfab-4f18-b624-722021700885\") " pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.354825 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw2z9\" (UniqueName: \"kubernetes.io/projected/0f2c76ec-cfab-4f18-b624-722021700885-kube-api-access-fw2z9\") pod \"dnsmasq-dns-57d769cc4f-f9h55\" (UID: \"0f2c76ec-cfab-4f18-b624-722021700885\") " pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" 
Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.354870 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f2c76ec-cfab-4f18-b624-722021700885-config\") pod \"dnsmasq-dns-57d769cc4f-f9h55\" (UID: \"0f2c76ec-cfab-4f18-b624-722021700885\") " pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.455901 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f2c76ec-cfab-4f18-b624-722021700885-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-f9h55\" (UID: \"0f2c76ec-cfab-4f18-b624-722021700885\") " pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.455949 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw2z9\" (UniqueName: \"kubernetes.io/projected/0f2c76ec-cfab-4f18-b624-722021700885-kube-api-access-fw2z9\") pod \"dnsmasq-dns-57d769cc4f-f9h55\" (UID: \"0f2c76ec-cfab-4f18-b624-722021700885\") " pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.455982 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f2c76ec-cfab-4f18-b624-722021700885-config\") pod \"dnsmasq-dns-57d769cc4f-f9h55\" (UID: \"0f2c76ec-cfab-4f18-b624-722021700885\") " pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.456939 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f2c76ec-cfab-4f18-b624-722021700885-config\") pod \"dnsmasq-dns-57d769cc4f-f9h55\" (UID: \"0f2c76ec-cfab-4f18-b624-722021700885\") " pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.457253 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f2c76ec-cfab-4f18-b624-722021700885-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-f9h55\" (UID: \"0f2c76ec-cfab-4f18-b624-722021700885\") " pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.480174 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw2z9\" (UniqueName: \"kubernetes.io/projected/0f2c76ec-cfab-4f18-b624-722021700885-kube-api-access-fw2z9\") pod \"dnsmasq-dns-57d769cc4f-f9h55\" (UID: \"0f2c76ec-cfab-4f18-b624-722021700885\") " pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.639746 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" Feb 27 00:24:58 crc kubenswrapper[4781]: I0227 00:24:58.722138 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-86z46"] Feb 27 00:24:58 crc kubenswrapper[4781]: W0227 00:24:58.732843 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12142f3c_5849_4af1_8c9e_c92304d3c375.slice/crio-719bdff771357569191b4394e9f9c49ad9d8c251f65a73dabcbbf8b3ba8bdaa2 WatchSource:0}: Error finding container 719bdff771357569191b4394e9f9c49ad9d8c251f65a73dabcbbf8b3ba8bdaa2: Status 404 returned error can't find the container with id 719bdff771357569191b4394e9f9c49ad9d8c251f65a73dabcbbf8b3ba8bdaa2 Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.022422 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.024145 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.027756 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-jcfdg" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.032268 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.032351 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.032565 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.032644 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.032785 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.033822 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.034793 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.069828 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.069893 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/919ba171-1971-416c-99c1-5dfcacc10a28-server-conf\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.069984 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.070103 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.070149 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/919ba171-1971-416c-99c1-5dfcacc10a28-config-data\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.070201 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/919ba171-1971-416c-99c1-5dfcacc10a28-pod-info\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.070234 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/919ba171-1971-416c-99c1-5dfcacc10a28-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.070313 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/919ba171-1971-416c-99c1-5dfcacc10a28-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.070367 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf9tq\" (UniqueName: \"kubernetes.io/projected/919ba171-1971-416c-99c1-5dfcacc10a28-kube-api-access-tf9tq\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.070478 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-32c98d96-9f26-419a-9095-7dcb737794cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32c98d96-9f26-419a-9095-7dcb737794cc\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.070545 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.100760 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-f9h55"] Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 
00:24:59.178218 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.178269 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/919ba171-1971-416c-99c1-5dfcacc10a28-config-data\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.178307 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/919ba171-1971-416c-99c1-5dfcacc10a28-pod-info\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.178342 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/919ba171-1971-416c-99c1-5dfcacc10a28-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.178390 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/919ba171-1971-416c-99c1-5dfcacc10a28-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.178422 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tf9tq\" (UniqueName: 
\"kubernetes.io/projected/919ba171-1971-416c-99c1-5dfcacc10a28-kube-api-access-tf9tq\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.178891 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-32c98d96-9f26-419a-9095-7dcb737794cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32c98d96-9f26-419a-9095-7dcb737794cc\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.178939 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.179229 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.179300 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/919ba171-1971-416c-99c1-5dfcacc10a28-server-conf\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.179485 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/919ba171-1971-416c-99c1-5dfcacc10a28-config-data\") pod \"rabbitmq-server-0\" (UID: 
\"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.179937 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/919ba171-1971-416c-99c1-5dfcacc10a28-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.183404 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.183574 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.184005 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.192002 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/919ba171-1971-416c-99c1-5dfcacc10a28-server-conf\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.197754 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/919ba171-1971-416c-99c1-5dfcacc10a28-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.197779 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/919ba171-1971-416c-99c1-5dfcacc10a28-pod-info\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.198518 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.198543 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-32c98d96-9f26-419a-9095-7dcb737794cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32c98d96-9f26-419a-9095-7dcb737794cc\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a1416593fd912ec74c6e12871251980e537685bd157bf8eba211fce64d9b048a/globalmount\"" pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.199949 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.200906 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.206369 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf9tq\" (UniqueName: \"kubernetes.io/projected/919ba171-1971-416c-99c1-5dfcacc10a28-kube-api-access-tf9tq\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.238724 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-32c98d96-9f26-419a-9095-7dcb737794cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32c98d96-9f26-419a-9095-7dcb737794cc\") pod \"rabbitmq-server-0\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.361133 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.408456 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.412309 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.417007 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.417063 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.417091 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.417237 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.417420 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.417549 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.417579 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-t6n8b" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.430786 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.453667 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-86z46" event={"ID":"12142f3c-5849-4af1-8c9e-c92304d3c375","Type":"ContainerStarted","Data":"719bdff771357569191b4394e9f9c49ad9d8c251f65a73dabcbbf8b3ba8bdaa2"} Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.456750 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" 
event={"ID":"0f2c76ec-cfab-4f18-b624-722021700885","Type":"ContainerStarted","Data":"893d8f643c3cbe02cd19b59bf3115d432c587df0f05ea410ba0d0253101d7031"} Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.488067 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.488110 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.488141 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.488156 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.488183 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-config-data\") 
pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.488205 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.488225 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.488269 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.488285 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.488309 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.488323 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnr8b\" (UniqueName: \"kubernetes.io/projected/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-kube-api-access-dnr8b\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.588948 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.588987 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.589025 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.589053 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.589082 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.589127 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.589143 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.589170 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.589191 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnr8b\" (UniqueName: 
\"kubernetes.io/projected/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-kube-api-access-dnr8b\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.589231 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.589257 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.590013 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.590121 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.590879 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-plugins-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.591306 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.592094 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.593235 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.594199 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.596128 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc 
kubenswrapper[4781]: I0227 00:24:59.597037 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.597069 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a2eaf337fb87b6a71958dbd52c87dbf5c448ea95938dfd82cb1cc22a9e40efc9/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.600130 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.615981 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnr8b\" (UniqueName: \"kubernetes.io/projected/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-kube-api-access-dnr8b\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.642412 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\") pod \"rabbitmq-cell1-server-0\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.757077 4781 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:24:59 crc kubenswrapper[4781]: I0227 00:24:59.959321 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.436777 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.439042 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.442348 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.443752 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-mnp2q" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.443917 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.445878 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.451113 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.452844 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.508137 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d59d3864-af0d-407c-8431-ae2e17e4b46f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 
00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.508224 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d59d3864-af0d-407c-8431-ae2e17e4b46f-config-data-default\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.508250 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d59d3864-af0d-407c-8431-ae2e17e4b46f-kolla-config\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.508268 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d59d3864-af0d-407c-8431-ae2e17e4b46f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.508287 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrjv2\" (UniqueName: \"kubernetes.io/projected/d59d3864-af0d-407c-8431-ae2e17e4b46f-kube-api-access-hrjv2\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.508306 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d59d3864-af0d-407c-8431-ae2e17e4b46f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: 
I0227 00:25:00.508325 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d59d3864-af0d-407c-8431-ae2e17e4b46f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.508399 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6418f01a-d10c-4778-8bab-9c32fe35f8ae\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6418f01a-d10c-4778-8bab-9c32fe35f8ae\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.609453 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6418f01a-d10c-4778-8bab-9c32fe35f8ae\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6418f01a-d10c-4778-8bab-9c32fe35f8ae\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.609510 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d59d3864-af0d-407c-8431-ae2e17e4b46f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.609537 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d59d3864-af0d-407c-8431-ae2e17e4b46f-config-data-default\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: 
I0227 00:25:00.609556 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d59d3864-af0d-407c-8431-ae2e17e4b46f-kolla-config\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.609575 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d59d3864-af0d-407c-8431-ae2e17e4b46f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.609589 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrjv2\" (UniqueName: \"kubernetes.io/projected/d59d3864-af0d-407c-8431-ae2e17e4b46f-kube-api-access-hrjv2\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.609608 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d59d3864-af0d-407c-8431-ae2e17e4b46f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.609639 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d59d3864-af0d-407c-8431-ae2e17e4b46f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.610111 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/d59d3864-af0d-407c-8431-ae2e17e4b46f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.610838 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d59d3864-af0d-407c-8431-ae2e17e4b46f-kolla-config\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.611678 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d59d3864-af0d-407c-8431-ae2e17e4b46f-config-data-default\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.612032 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d59d3864-af0d-407c-8431-ae2e17e4b46f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.624311 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d59d3864-af0d-407c-8431-ae2e17e4b46f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.629958 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d59d3864-af0d-407c-8431-ae2e17e4b46f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " 
pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.658998 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.659274 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6418f01a-d10c-4778-8bab-9c32fe35f8ae\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6418f01a-d10c-4778-8bab-9c32fe35f8ae\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/af56ff13fee24b73e63022e5c2516390dc1057be7bc55daf5aed00165d3047c9/globalmount\"" pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.659086 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrjv2\" (UniqueName: \"kubernetes.io/projected/d59d3864-af0d-407c-8431-ae2e17e4b46f-kube-api-access-hrjv2\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:00 crc kubenswrapper[4781]: I0227 00:25:00.804261 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6418f01a-d10c-4778-8bab-9c32fe35f8ae\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6418f01a-d10c-4778-8bab-9c32fe35f8ae\") pod \"openstack-galera-0\" (UID: \"d59d3864-af0d-407c-8431-ae2e17e4b46f\") " pod="openstack/openstack-galera-0" Feb 27 00:25:01 crc kubenswrapper[4781]: I0227 00:25:01.067583 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0"
Feb 27 00:25:01 crc kubenswrapper[4781]: I0227 00:25:01.914402 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Feb 27 00:25:01 crc kubenswrapper[4781]: I0227 00:25:01.915591 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Feb 27 00:25:01 crc kubenswrapper[4781]: I0227 00:25:01.922911 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-dn6l5"
Feb 27 00:25:01 crc kubenswrapper[4781]: I0227 00:25:01.923901 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Feb 27 00:25:01 crc kubenswrapper[4781]: I0227 00:25:01.924590 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Feb 27 00:25:01 crc kubenswrapper[4781]: I0227 00:25:01.925350 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Feb 27 00:25:01 crc kubenswrapper[4781]: I0227 00:25:01.938525 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/22624edd-e366-4aff-84dd-c3cec89c0591-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0"
Feb 27 00:25:01 crc kubenswrapper[4781]: I0227 00:25:01.938573 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22624edd-e366-4aff-84dd-c3cec89c0591-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0"
Feb 27 00:25:01 crc kubenswrapper[4781]: I0227 00:25:01.938611 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/22624edd-e366-4aff-84dd-c3cec89c0591-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0"
Feb 27 00:25:01 crc kubenswrapper[4781]: I0227 00:25:01.938686 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/22624edd-e366-4aff-84dd-c3cec89c0591-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0"
Feb 27 00:25:01 crc kubenswrapper[4781]: I0227 00:25:01.938710 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/22624edd-e366-4aff-84dd-c3cec89c0591-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0"
Feb 27 00:25:01 crc kubenswrapper[4781]: I0227 00:25:01.938738 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgj9x\" (UniqueName: \"kubernetes.io/projected/22624edd-e366-4aff-84dd-c3cec89c0591-kube-api-access-jgj9x\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0"
Feb 27 00:25:01 crc kubenswrapper[4781]: I0227 00:25:01.938761 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22624edd-e366-4aff-84dd-c3cec89c0591-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0"
Feb 27 00:25:01 crc kubenswrapper[4781]: I0227 00:25:01.938787 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-83f1a113-e66d-4ace-84c6-db992b1c9591\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-83f1a113-e66d-4ace-84c6-db992b1c9591\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0"
Feb 27 00:25:01 crc kubenswrapper[4781]: I0227 00:25:01.946190 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.040930 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/22624edd-e366-4aff-84dd-c3cec89c0591-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0"
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.041051 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/22624edd-e366-4aff-84dd-c3cec89c0591-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0"
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.041090 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgj9x\" (UniqueName: \"kubernetes.io/projected/22624edd-e366-4aff-84dd-c3cec89c0591-kube-api-access-jgj9x\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0"
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.041114 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22624edd-e366-4aff-84dd-c3cec89c0591-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0"
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.041150 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-83f1a113-e66d-4ace-84c6-db992b1c9591\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-83f1a113-e66d-4ace-84c6-db992b1c9591\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0"
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.041208 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/22624edd-e366-4aff-84dd-c3cec89c0591-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0"
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.041240 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22624edd-e366-4aff-84dd-c3cec89c0591-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0"
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.041300 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/22624edd-e366-4aff-84dd-c3cec89c0591-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0"
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.041748 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/22624edd-e366-4aff-84dd-c3cec89c0591-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0"
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.042613 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/22624edd-e366-4aff-84dd-c3cec89c0591-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0"
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.043797 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22624edd-e366-4aff-84dd-c3cec89c0591-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0"
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.053575 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/22624edd-e366-4aff-84dd-c3cec89c0591-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0"
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.056026 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/22624edd-e366-4aff-84dd-c3cec89c0591-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0"
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.062872 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22624edd-e366-4aff-84dd-c3cec89c0591-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0"
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.065100 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgj9x\" (UniqueName: \"kubernetes.io/projected/22624edd-e366-4aff-84dd-c3cec89c0591-kube-api-access-jgj9x\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0"
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.065257 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.065284 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-83f1a113-e66d-4ace-84c6-db992b1c9591\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-83f1a113-e66d-4ace-84c6-db992b1c9591\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2626e085f5eebde1c78ff01a19f2918ba6aad6cd8b70016bc6cf3611ba49beaf/globalmount\"" pod="openstack/openstack-cell1-galera-0"
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.111836 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-83f1a113-e66d-4ace-84c6-db992b1c9591\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-83f1a113-e66d-4ace-84c6-db992b1c9591\") pod \"openstack-cell1-galera-0\" (UID: \"22624edd-e366-4aff-84dd-c3cec89c0591\") " pod="openstack/openstack-cell1-galera-0"
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.189310 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.190808 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.195421 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.195746 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-9jjf8"
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.196029 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.202711 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.243997 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/06e98c4a-d812-4e42-b95c-d263e49bf5d3-kolla-config\") pod \"memcached-0\" (UID: \"06e98c4a-d812-4e42-b95c-d263e49bf5d3\") " pod="openstack/memcached-0"
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.244038 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/06e98c4a-d812-4e42-b95c-d263e49bf5d3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"06e98c4a-d812-4e42-b95c-d263e49bf5d3\") " pod="openstack/memcached-0"
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.244137 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e98c4a-d812-4e42-b95c-d263e49bf5d3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"06e98c4a-d812-4e42-b95c-d263e49bf5d3\") " pod="openstack/memcached-0"
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.244162 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/06e98c4a-d812-4e42-b95c-d263e49bf5d3-config-data\") pod \"memcached-0\" (UID: \"06e98c4a-d812-4e42-b95c-d263e49bf5d3\") " pod="openstack/memcached-0"
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.244189 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmvrr\" (UniqueName: \"kubernetes.io/projected/06e98c4a-d812-4e42-b95c-d263e49bf5d3-kube-api-access-mmvrr\") pod \"memcached-0\" (UID: \"06e98c4a-d812-4e42-b95c-d263e49bf5d3\") " pod="openstack/memcached-0"
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.244308 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.345293 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmvrr\" (UniqueName: \"kubernetes.io/projected/06e98c4a-d812-4e42-b95c-d263e49bf5d3-kube-api-access-mmvrr\") pod \"memcached-0\" (UID: \"06e98c4a-d812-4e42-b95c-d263e49bf5d3\") " pod="openstack/memcached-0"
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.345337 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/06e98c4a-d812-4e42-b95c-d263e49bf5d3-kolla-config\") pod \"memcached-0\" (UID: \"06e98c4a-d812-4e42-b95c-d263e49bf5d3\") " pod="openstack/memcached-0"
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.345357 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/06e98c4a-d812-4e42-b95c-d263e49bf5d3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"06e98c4a-d812-4e42-b95c-d263e49bf5d3\") " pod="openstack/memcached-0"
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.345482 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e98c4a-d812-4e42-b95c-d263e49bf5d3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"06e98c4a-d812-4e42-b95c-d263e49bf5d3\") " pod="openstack/memcached-0"
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.345511 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/06e98c4a-d812-4e42-b95c-d263e49bf5d3-config-data\") pod \"memcached-0\" (UID: \"06e98c4a-d812-4e42-b95c-d263e49bf5d3\") " pod="openstack/memcached-0"
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.347478 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/06e98c4a-d812-4e42-b95c-d263e49bf5d3-kolla-config\") pod \"memcached-0\" (UID: \"06e98c4a-d812-4e42-b95c-d263e49bf5d3\") " pod="openstack/memcached-0"
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.349210 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/06e98c4a-d812-4e42-b95c-d263e49bf5d3-config-data\") pod \"memcached-0\" (UID: \"06e98c4a-d812-4e42-b95c-d263e49bf5d3\") " pod="openstack/memcached-0"
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.351328 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/06e98c4a-d812-4e42-b95c-d263e49bf5d3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"06e98c4a-d812-4e42-b95c-d263e49bf5d3\") " pod="openstack/memcached-0"
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.363501 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e98c4a-d812-4e42-b95c-d263e49bf5d3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"06e98c4a-d812-4e42-b95c-d263e49bf5d3\") " pod="openstack/memcached-0"
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.366487 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmvrr\" (UniqueName: \"kubernetes.io/projected/06e98c4a-d812-4e42-b95c-d263e49bf5d3-kube-api-access-mmvrr\") pod \"memcached-0\" (UID: \"06e98c4a-d812-4e42-b95c-d263e49bf5d3\") " pod="openstack/memcached-0"
Feb 27 00:25:02 crc kubenswrapper[4781]: I0227 00:25:02.508533 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Feb 27 00:25:04 crc kubenswrapper[4781]: I0227 00:25:04.372081 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 27 00:25:04 crc kubenswrapper[4781]: I0227 00:25:04.373180 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 27 00:25:04 crc kubenswrapper[4781]: I0227 00:25:04.376234 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-nmsfm"
Feb 27 00:25:04 crc kubenswrapper[4781]: I0227 00:25:04.383975 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 27 00:25:04 crc kubenswrapper[4781]: I0227 00:25:04.476970 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr9c5\" (UniqueName: \"kubernetes.io/projected/91997a3e-9e65-4eab-a0b9-8f9c639a8d05-kube-api-access-hr9c5\") pod \"kube-state-metrics-0\" (UID: \"91997a3e-9e65-4eab-a0b9-8f9c639a8d05\") " pod="openstack/kube-state-metrics-0"
Feb 27 00:25:04 crc kubenswrapper[4781]: I0227 00:25:04.564513 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"919ba171-1971-416c-99c1-5dfcacc10a28","Type":"ContainerStarted","Data":"f58e1ef93098c46c57b5e59fd849c5fcd9c3a1bc9f7c9d503b32be5e67364d02"}
Feb 27 00:25:04 crc kubenswrapper[4781]: I0227 00:25:04.578013 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr9c5\" (UniqueName: \"kubernetes.io/projected/91997a3e-9e65-4eab-a0b9-8f9c639a8d05-kube-api-access-hr9c5\") pod \"kube-state-metrics-0\" (UID: \"91997a3e-9e65-4eab-a0b9-8f9c639a8d05\") " pod="openstack/kube-state-metrics-0"
Feb 27 00:25:04 crc kubenswrapper[4781]: I0227 00:25:04.617565 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr9c5\" (UniqueName: \"kubernetes.io/projected/91997a3e-9e65-4eab-a0b9-8f9c639a8d05-kube-api-access-hr9c5\") pod \"kube-state-metrics-0\" (UID: \"91997a3e-9e65-4eab-a0b9-8f9c639a8d05\") " pod="openstack/kube-state-metrics-0"
Feb 27 00:25:04 crc kubenswrapper[4781]: I0227 00:25:04.695479 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.003134 4781 scope.go:117] "RemoveContainer" containerID="86bad95d795a7faf37cb19be6e8217786d2cabd57a047f7210f59250bf6bee2f"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.143486 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"]
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.145236 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.148496 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.148772 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.152710 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.152739 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-77w49"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.152758 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.216690 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"]
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.315103 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blz5w\" (UniqueName: \"kubernetes.io/projected/58009056-4183-4017-bfa1-c14ce28b92ea-kube-api-access-blz5w\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.315145 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/58009056-4183-4017-bfa1-c14ce28b92ea-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.315189 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/58009056-4183-4017-bfa1-c14ce28b92ea-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.315212 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/58009056-4183-4017-bfa1-c14ce28b92ea-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.315234 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/58009056-4183-4017-bfa1-c14ce28b92ea-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.315294 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/58009056-4183-4017-bfa1-c14ce28b92ea-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.315320 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/58009056-4183-4017-bfa1-c14ce28b92ea-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.416523 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/58009056-4183-4017-bfa1-c14ce28b92ea-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.416569 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/58009056-4183-4017-bfa1-c14ce28b92ea-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.417163 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blz5w\" (UniqueName: \"kubernetes.io/projected/58009056-4183-4017-bfa1-c14ce28b92ea-kube-api-access-blz5w\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.417188 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/58009056-4183-4017-bfa1-c14ce28b92ea-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.417214 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/58009056-4183-4017-bfa1-c14ce28b92ea-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.417247 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/58009056-4183-4017-bfa1-c14ce28b92ea-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.417556 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/58009056-4183-4017-bfa1-c14ce28b92ea-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.418000 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/58009056-4183-4017-bfa1-c14ce28b92ea-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.424386 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/58009056-4183-4017-bfa1-c14ce28b92ea-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.425064 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/58009056-4183-4017-bfa1-c14ce28b92ea-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.425776 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/58009056-4183-4017-bfa1-c14ce28b92ea-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.425961 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/58009056-4183-4017-bfa1-c14ce28b92ea-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.428558 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/58009056-4183-4017-bfa1-c14ce28b92ea-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.458314 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blz5w\" (UniqueName: \"kubernetes.io/projected/58009056-4183-4017-bfa1-c14ce28b92ea-kube-api-access-blz5w\") pod \"alertmanager-metric-storage-0\" (UID: \"58009056-4183-4017-bfa1-c14ce28b92ea\") " pod="openstack/alertmanager-metric-storage-0"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.461176 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.764916 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.767002 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.769806 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.769884 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.770066 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.770109 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.770202 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-zmzb4"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.770239 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.770313 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.770067 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.795407 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.926289 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1f85c54b-b800-429a-ba2d-fe22056ac907-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.926340 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f85c54b-b800-429a-ba2d-fe22056ac907-config\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.926502 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/1f85c54b-b800-429a-ba2d-fe22056ac907-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.926608 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1f85c54b-b800-429a-ba2d-fe22056ac907-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.926672 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2945l\" (UniqueName: \"kubernetes.io/projected/1f85c54b-b800-429a-ba2d-fe22056ac907-kube-api-access-2945l\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.926715 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1f85c54b-b800-429a-ba2d-fe22056ac907-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.926752 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1f85c54b-b800-429a-ba2d-fe22056ac907-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.926872 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1f85c54b-b800-429a-ba2d-fe22056ac907-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.926990 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1f85c54b-b800-429a-ba2d-fe22056ac907-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0"
Feb 27 00:25:05 crc kubenswrapper[4781]: I0227 00:25:05.927024 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0"
Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.029018 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1f85c54b-b800-429a-ba2d-fe22056ac907-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0"
Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.029063 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0"
Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.029099 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1f85c54b-b800-429a-ba2d-fe22056ac907-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0"
Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.029121 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f85c54b-b800-429a-ba2d-fe22056ac907-config\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0"
Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.029158 4781 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/1f85c54b-b800-429a-ba2d-fe22056ac907-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.029190 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1f85c54b-b800-429a-ba2d-fe22056ac907-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.029210 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2945l\" (UniqueName: \"kubernetes.io/projected/1f85c54b-b800-429a-ba2d-fe22056ac907-kube-api-access-2945l\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.029239 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1f85c54b-b800-429a-ba2d-fe22056ac907-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.029259 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1f85c54b-b800-429a-ba2d-fe22056ac907-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " 
pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.029285 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1f85c54b-b800-429a-ba2d-fe22056ac907-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.030927 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/1f85c54b-b800-429a-ba2d-fe22056ac907-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.030956 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1f85c54b-b800-429a-ba2d-fe22056ac907-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.030998 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1f85c54b-b800-429a-ba2d-fe22056ac907-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.034113 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1f85c54b-b800-429a-ba2d-fe22056ac907-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: 
\"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.034257 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1f85c54b-b800-429a-ba2d-fe22056ac907-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.034343 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1f85c54b-b800-429a-ba2d-fe22056ac907-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.034584 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f85c54b-b800-429a-ba2d-fe22056ac907-config\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.034746 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1f85c54b-b800-429a-ba2d-fe22056ac907-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.036127 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.036181 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b26095f48a6799aae7472dc34ad76c7f8559a3fa84033df1f18203d2595242ed/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.053728 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2945l\" (UniqueName: \"kubernetes.io/projected/1f85c54b-b800-429a-ba2d-fe22056ac907-kube-api-access-2945l\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.072874 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\") pod \"prometheus-metric-storage-0\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:06 crc kubenswrapper[4781]: I0227 00:25:06.094479 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 27 00:25:07 crc kubenswrapper[4781]: I0227 00:25:07.958102 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-hcb9s"] Feb 27 00:25:07 crc kubenswrapper[4781]: I0227 00:25:07.960242 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:07 crc kubenswrapper[4781]: I0227 00:25:07.962494 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 27 00:25:07 crc kubenswrapper[4781]: I0227 00:25:07.962841 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-rxqbb" Feb 27 00:25:07 crc kubenswrapper[4781]: I0227 00:25:07.977259 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-hcb9s"] Feb 27 00:25:07 crc kubenswrapper[4781]: I0227 00:25:07.987874 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9zkpb"] Feb 27 00:25:07 crc kubenswrapper[4781]: I0227 00:25:07.989062 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.030417 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.048609 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9zkpb"] Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.060941 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9c2c498e-52b1-4ee2-bcf8-3599ee89513c-var-run\") pod \"ovn-controller-ovs-hcb9s\" (UID: \"9c2c498e-52b1-4ee2-bcf8-3599ee89513c\") " pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.060997 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9c2c498e-52b1-4ee2-bcf8-3599ee89513c-var-log\") pod \"ovn-controller-ovs-hcb9s\" (UID: \"9c2c498e-52b1-4ee2-bcf8-3599ee89513c\") " 
pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.061024 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrnx8\" (UniqueName: \"kubernetes.io/projected/9c2c498e-52b1-4ee2-bcf8-3599ee89513c-kube-api-access-vrnx8\") pod \"ovn-controller-ovs-hcb9s\" (UID: \"9c2c498e-52b1-4ee2-bcf8-3599ee89513c\") " pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.061059 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9c2c498e-52b1-4ee2-bcf8-3599ee89513c-var-lib\") pod \"ovn-controller-ovs-hcb9s\" (UID: \"9c2c498e-52b1-4ee2-bcf8-3599ee89513c\") " pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.061110 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c2c498e-52b1-4ee2-bcf8-3599ee89513c-scripts\") pod \"ovn-controller-ovs-hcb9s\" (UID: \"9c2c498e-52b1-4ee2-bcf8-3599ee89513c\") " pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.061135 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9c2c498e-52b1-4ee2-bcf8-3599ee89513c-etc-ovs\") pod \"ovn-controller-ovs-hcb9s\" (UID: \"9c2c498e-52b1-4ee2-bcf8-3599ee89513c\") " pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.162397 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrnx8\" (UniqueName: \"kubernetes.io/projected/9c2c498e-52b1-4ee2-bcf8-3599ee89513c-kube-api-access-vrnx8\") pod \"ovn-controller-ovs-hcb9s\" (UID: \"9c2c498e-52b1-4ee2-bcf8-3599ee89513c\") " 
pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.162457 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/092921e0-a033-4021-b0f5-9c89de3aa830-scripts\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.162478 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9c2c498e-52b1-4ee2-bcf8-3599ee89513c-var-lib\") pod \"ovn-controller-ovs-hcb9s\" (UID: \"9c2c498e-52b1-4ee2-bcf8-3599ee89513c\") " pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.162498 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092921e0-a033-4021-b0f5-9c89de3aa830-combined-ca-bundle\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.162528 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/092921e0-a033-4021-b0f5-9c89de3aa830-ovn-controller-tls-certs\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.162567 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c2c498e-52b1-4ee2-bcf8-3599ee89513c-scripts\") pod \"ovn-controller-ovs-hcb9s\" (UID: \"9c2c498e-52b1-4ee2-bcf8-3599ee89513c\") " pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: 
I0227 00:25:08.162685 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/092921e0-a033-4021-b0f5-9c89de3aa830-var-run\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.162760 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9c2c498e-52b1-4ee2-bcf8-3599ee89513c-etc-ovs\") pod \"ovn-controller-ovs-hcb9s\" (UID: \"9c2c498e-52b1-4ee2-bcf8-3599ee89513c\") " pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.162935 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj29s\" (UniqueName: \"kubernetes.io/projected/092921e0-a033-4021-b0f5-9c89de3aa830-kube-api-access-rj29s\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.162996 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9c2c498e-52b1-4ee2-bcf8-3599ee89513c-var-run\") pod \"ovn-controller-ovs-hcb9s\" (UID: \"9c2c498e-52b1-4ee2-bcf8-3599ee89513c\") " pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.162998 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9c2c498e-52b1-4ee2-bcf8-3599ee89513c-var-lib\") pod \"ovn-controller-ovs-hcb9s\" (UID: \"9c2c498e-52b1-4ee2-bcf8-3599ee89513c\") " pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.163042 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/092921e0-a033-4021-b0f5-9c89de3aa830-var-log-ovn\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.163066 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9c2c498e-52b1-4ee2-bcf8-3599ee89513c-etc-ovs\") pod \"ovn-controller-ovs-hcb9s\" (UID: \"9c2c498e-52b1-4ee2-bcf8-3599ee89513c\") " pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.163070 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/092921e0-a033-4021-b0f5-9c89de3aa830-var-run-ovn\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.163137 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9c2c498e-52b1-4ee2-bcf8-3599ee89513c-var-log\") pod \"ovn-controller-ovs-hcb9s\" (UID: \"9c2c498e-52b1-4ee2-bcf8-3599ee89513c\") " pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.163137 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9c2c498e-52b1-4ee2-bcf8-3599ee89513c-var-run\") pod \"ovn-controller-ovs-hcb9s\" (UID: \"9c2c498e-52b1-4ee2-bcf8-3599ee89513c\") " pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.163237 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9c2c498e-52b1-4ee2-bcf8-3599ee89513c-var-log\") pod \"ovn-controller-ovs-hcb9s\" (UID: 
\"9c2c498e-52b1-4ee2-bcf8-3599ee89513c\") " pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.164695 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c2c498e-52b1-4ee2-bcf8-3599ee89513c-scripts\") pod \"ovn-controller-ovs-hcb9s\" (UID: \"9c2c498e-52b1-4ee2-bcf8-3599ee89513c\") " pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.193175 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrnx8\" (UniqueName: \"kubernetes.io/projected/9c2c498e-52b1-4ee2-bcf8-3599ee89513c-kube-api-access-vrnx8\") pod \"ovn-controller-ovs-hcb9s\" (UID: \"9c2c498e-52b1-4ee2-bcf8-3599ee89513c\") " pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.254473 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.255885 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.259776 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.259789 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-m8pgd" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.260154 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.260156 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.260793 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.264093 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/092921e0-a033-4021-b0f5-9c89de3aa830-var-log-ovn\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.264122 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/092921e0-a033-4021-b0f5-9c89de3aa830-var-run-ovn\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.264169 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/092921e0-a033-4021-b0f5-9c89de3aa830-scripts\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " 
pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.264191 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092921e0-a033-4021-b0f5-9c89de3aa830-combined-ca-bundle\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.264220 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/092921e0-a033-4021-b0f5-9c89de3aa830-ovn-controller-tls-certs\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.264258 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/092921e0-a033-4021-b0f5-9c89de3aa830-var-run\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.264310 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj29s\" (UniqueName: \"kubernetes.io/projected/092921e0-a033-4021-b0f5-9c89de3aa830-kube-api-access-rj29s\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.264399 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/092921e0-a033-4021-b0f5-9c89de3aa830-var-log-ovn\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.264498 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/092921e0-a033-4021-b0f5-9c89de3aa830-var-run-ovn\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.264744 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/092921e0-a033-4021-b0f5-9c89de3aa830-var-run\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.266432 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/092921e0-a033-4021-b0f5-9c89de3aa830-scripts\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.269239 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092921e0-a033-4021-b0f5-9c89de3aa830-combined-ca-bundle\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.271462 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/092921e0-a033-4021-b0f5-9c89de3aa830-ovn-controller-tls-certs\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.274586 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.281473 4781 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.310097 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj29s\" (UniqueName: \"kubernetes.io/projected/092921e0-a033-4021-b0f5-9c89de3aa830-kube-api-access-rj29s\") pod \"ovn-controller-9zkpb\" (UID: \"092921e0-a033-4021-b0f5-9c89de3aa830\") " pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.350049 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.366525 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd103c67-d035-4de1-aba9-667d1eb67813-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.366570 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd103c67-d035-4de1-aba9-667d1eb67813-config\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.366594 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd103c67-d035-4de1-aba9-667d1eb67813-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.366618 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-c1d1aeca-fc6d-46c3-b5d4-052d2fe646dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1d1aeca-fc6d-46c3-b5d4-052d2fe646dc\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.366668 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdtk4\" (UniqueName: \"kubernetes.io/projected/bd103c67-d035-4de1-aba9-667d1eb67813-kube-api-access-vdtk4\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.366686 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd103c67-d035-4de1-aba9-667d1eb67813-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.366705 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd103c67-d035-4de1-aba9-667d1eb67813-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.366747 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd103c67-d035-4de1-aba9-667d1eb67813-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.468567 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bd103c67-d035-4de1-aba9-667d1eb67813-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.468619 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c1d1aeca-fc6d-46c3-b5d4-052d2fe646dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1d1aeca-fc6d-46c3-b5d4-052d2fe646dc\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.468683 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdtk4\" (UniqueName: \"kubernetes.io/projected/bd103c67-d035-4de1-aba9-667d1eb67813-kube-api-access-vdtk4\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.468715 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd103c67-d035-4de1-aba9-667d1eb67813-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.468735 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd103c67-d035-4de1-aba9-667d1eb67813-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.468779 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd103c67-d035-4de1-aba9-667d1eb67813-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: 
\"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.468832 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd103c67-d035-4de1-aba9-667d1eb67813-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.468859 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd103c67-d035-4de1-aba9-667d1eb67813-config\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.469687 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd103c67-d035-4de1-aba9-667d1eb67813-config\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.470983 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd103c67-d035-4de1-aba9-667d1eb67813-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.473148 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd103c67-d035-4de1-aba9-667d1eb67813-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.474342 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd103c67-d035-4de1-aba9-667d1eb67813-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.476941 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.476969 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c1d1aeca-fc6d-46c3-b5d4-052d2fe646dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1d1aeca-fc6d-46c3-b5d4-052d2fe646dc\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7d22a4830c7704a84644fe2448f23da2d221b5f43f3bfc24558389697e21a972/globalmount\"" pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.477336 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd103c67-d035-4de1-aba9-667d1eb67813-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.477473 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd103c67-d035-4de1-aba9-667d1eb67813-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.489558 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdtk4\" (UniqueName: 
\"kubernetes.io/projected/bd103c67-d035-4de1-aba9-667d1eb67813-kube-api-access-vdtk4\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.547512 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c1d1aeca-fc6d-46c3-b5d4-052d2fe646dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1d1aeca-fc6d-46c3-b5d4-052d2fe646dc\") pod \"ovsdbserver-nb-0\" (UID: \"bd103c67-d035-4de1-aba9-667d1eb67813\") " pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:08 crc kubenswrapper[4781]: I0227 00:25:08.675379 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 27 00:25:12 crc kubenswrapper[4781]: I0227 00:25:12.853338 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 27 00:25:12 crc kubenswrapper[4781]: I0227 00:25:12.856572 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:12 crc kubenswrapper[4781]: I0227 00:25:12.860585 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 27 00:25:12 crc kubenswrapper[4781]: I0227 00:25:12.860771 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 27 00:25:12 crc kubenswrapper[4781]: I0227 00:25:12.860843 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 27 00:25:12 crc kubenswrapper[4781]: I0227 00:25:12.860965 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-4hx8w" Feb 27 00:25:12 crc kubenswrapper[4781]: I0227 00:25:12.861182 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 27 00:25:12 crc kubenswrapper[4781]: I0227 00:25:12.946446 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:12 crc kubenswrapper[4781]: I0227 00:25:12.946500 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-54fcded5-5d2b-4d43-a81f-13a4be11c64c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-54fcded5-5d2b-4d43-a81f-13a4be11c64c\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:12 crc kubenswrapper[4781]: I0227 00:25:12.946526 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:12 crc kubenswrapper[4781]: I0227 00:25:12.946565 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:12 crc kubenswrapper[4781]: I0227 00:25:12.946599 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:12 crc kubenswrapper[4781]: I0227 00:25:12.946612 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:12 crc kubenswrapper[4781]: I0227 00:25:12.946646 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-config\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:12 crc kubenswrapper[4781]: I0227 00:25:12.946664 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4ccq\" (UniqueName: 
\"kubernetes.io/projected/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-kube-api-access-s4ccq\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.048136 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.048202 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-54fcded5-5d2b-4d43-a81f-13a4be11c64c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-54fcded5-5d2b-4d43-a81f-13a4be11c64c\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.048236 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.048281 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.048325 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-scripts\") pod \"ovsdbserver-sb-0\" (UID: 
\"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.048347 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.048371 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4ccq\" (UniqueName: \"kubernetes.io/projected/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-kube-api-access-s4ccq\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.048393 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-config\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.048698 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.049572 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.051361 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-config\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.052311 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.052347 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-54fcded5-5d2b-4d43-a81f-13a4be11c64c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-54fcded5-5d2b-4d43-a81f-13a4be11c64c\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c97dcfd59732a972091dce3593a8b31fc374fcdbce6ba9daf533c75d52555044/globalmount\"" pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.054497 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.057339 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.059133 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-combined-ca-bundle\") pod 
\"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.068889 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4ccq\" (UniqueName: \"kubernetes.io/projected/7d499c77-ccba-41d1-9efb-8424fc7e8d0e-kube-api-access-s4ccq\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.090870 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-54fcded5-5d2b-4d43-a81f-13a4be11c64c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-54fcded5-5d2b-4d43-a81f-13a4be11c64c\") pod \"ovsdbserver-sb-0\" (UID: \"7d499c77-ccba-41d1-9efb-8424fc7e8d0e\") " pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:13 crc kubenswrapper[4781]: I0227 00:25:13.173534 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.441049 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf"] Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.446950 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.486089 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca-bundle" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.486432 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-http" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.486657 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-dockercfg-dnw9v" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.486859 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-config" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.487409 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-grpc" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.493067 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf"] Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.604746 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njkb6\" (UniqueName: \"kubernetes.io/projected/a5170e93-09e9-40d2-ac65-b87d44ceb185-kube-api-access-njkb6\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-nqbgf\" (UID: \"a5170e93-09e9-40d2-ac65-b87d44ceb185\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.604835 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5170e93-09e9-40d2-ac65-b87d44ceb185-config\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-nqbgf\" (UID: \"a5170e93-09e9-40d2-ac65-b87d44ceb185\") 
" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.604968 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/a5170e93-09e9-40d2-ac65-b87d44ceb185-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-nqbgf\" (UID: \"a5170e93-09e9-40d2-ac65-b87d44ceb185\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.604994 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/a5170e93-09e9-40d2-ac65-b87d44ceb185-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-nqbgf\" (UID: \"a5170e93-09e9-40d2-ac65-b87d44ceb185\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.605017 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5170e93-09e9-40d2-ac65-b87d44ceb185-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-nqbgf\" (UID: \"a5170e93-09e9-40d2-ac65-b87d44ceb185\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.618984 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4"] Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.623251 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.626349 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-grpc" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.626681 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-http" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.626842 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-loki-s3" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.641275 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4"] Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.703428 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f"] Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.704648 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.708450 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njkb6\" (UniqueName: \"kubernetes.io/projected/a5170e93-09e9-40d2-ac65-b87d44ceb185-kube-api-access-njkb6\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-nqbgf\" (UID: \"a5170e93-09e9-40d2-ac65-b87d44ceb185\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.708493 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d71cee9c-2288-4843-ab71-0720c8527073-config\") pod \"cloudkitty-lokistack-querier-58c84b5844-whkj4\" (UID: \"d71cee9c-2288-4843-ab71-0720c8527073\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.708554 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/d71cee9c-2288-4843-ab71-0720c8527073-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-58c84b5844-whkj4\" (UID: \"d71cee9c-2288-4843-ab71-0720c8527073\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.708587 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5170e93-09e9-40d2-ac65-b87d44ceb185-config\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-nqbgf\" (UID: \"a5170e93-09e9-40d2-ac65-b87d44ceb185\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.708665 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/d71cee9c-2288-4843-ab71-0720c8527073-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-58c84b5844-whkj4\" (UID: \"d71cee9c-2288-4843-ab71-0720c8527073\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.708701 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/a5170e93-09e9-40d2-ac65-b87d44ceb185-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-nqbgf\" (UID: \"a5170e93-09e9-40d2-ac65-b87d44ceb185\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.708721 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/d71cee9c-2288-4843-ab71-0720c8527073-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-58c84b5844-whkj4\" (UID: \"d71cee9c-2288-4843-ab71-0720c8527073\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.708737 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrtqr\" (UniqueName: \"kubernetes.io/projected/d71cee9c-2288-4843-ab71-0720c8527073-kube-api-access-rrtqr\") pod \"cloudkitty-lokistack-querier-58c84b5844-whkj4\" (UID: \"d71cee9c-2288-4843-ab71-0720c8527073\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.708754 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d71cee9c-2288-4843-ab71-0720c8527073-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-58c84b5844-whkj4\" (UID: \"d71cee9c-2288-4843-ab71-0720c8527073\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.708778 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/a5170e93-09e9-40d2-ac65-b87d44ceb185-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-nqbgf\" (UID: \"a5170e93-09e9-40d2-ac65-b87d44ceb185\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.708802 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5170e93-09e9-40d2-ac65-b87d44ceb185-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-nqbgf\" (UID: \"a5170e93-09e9-40d2-ac65-b87d44ceb185\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.709664 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5170e93-09e9-40d2-ac65-b87d44ceb185-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-nqbgf\" (UID: \"a5170e93-09e9-40d2-ac65-b87d44ceb185\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.711074 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5170e93-09e9-40d2-ac65-b87d44ceb185-config\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-nqbgf\" (UID: \"a5170e93-09e9-40d2-ac65-b87d44ceb185\") " 
pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.720040 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-grpc" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.725540 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f"] Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.732144 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/a5170e93-09e9-40d2-ac65-b87d44ceb185-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-nqbgf\" (UID: \"a5170e93-09e9-40d2-ac65-b87d44ceb185\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.734840 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/a5170e93-09e9-40d2-ac65-b87d44ceb185-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-nqbgf\" (UID: \"a5170e93-09e9-40d2-ac65-b87d44ceb185\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.735109 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njkb6\" (UniqueName: \"kubernetes.io/projected/a5170e93-09e9-40d2-ac65-b87d44ceb185-kube-api-access-njkb6\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-nqbgf\" (UID: \"a5170e93-09e9-40d2-ac65-b87d44ceb185\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.749786 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-http" Feb 27 
00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.810004 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/d9e3acc2-cee4-4bfe-af04-3a64041fc327-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f\" (UID: \"d9e3acc2-cee4-4bfe-af04-3a64041fc327\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.810083 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d71cee9c-2288-4843-ab71-0720c8527073-config\") pod \"cloudkitty-lokistack-querier-58c84b5844-whkj4\" (UID: \"d71cee9c-2288-4843-ab71-0720c8527073\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.810167 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/d71cee9c-2288-4843-ab71-0720c8527073-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-58c84b5844-whkj4\" (UID: \"d71cee9c-2288-4843-ab71-0720c8527073\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.810196 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9e3acc2-cee4-4bfe-af04-3a64041fc327-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f\" (UID: \"d9e3acc2-cee4-4bfe-af04-3a64041fc327\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.810279 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9e3acc2-cee4-4bfe-af04-3a64041fc327-config\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f\" (UID: \"d9e3acc2-cee4-4bfe-af04-3a64041fc327\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.810513 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/d9e3acc2-cee4-4bfe-af04-3a64041fc327-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f\" (UID: \"d9e3acc2-cee4-4bfe-af04-3a64041fc327\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.810664 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/d71cee9c-2288-4843-ab71-0720c8527073-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-58c84b5844-whkj4\" (UID: \"d71cee9c-2288-4843-ab71-0720c8527073\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.810776 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml94h\" (UniqueName: \"kubernetes.io/projected/d9e3acc2-cee4-4bfe-af04-3a64041fc327-kube-api-access-ml94h\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f\" (UID: \"d9e3acc2-cee4-4bfe-af04-3a64041fc327\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.810906 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/d71cee9c-2288-4843-ab71-0720c8527073-cloudkitty-loki-s3\") pod 
\"cloudkitty-lokistack-querier-58c84b5844-whkj4\" (UID: \"d71cee9c-2288-4843-ab71-0720c8527073\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.811007 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrtqr\" (UniqueName: \"kubernetes.io/projected/d71cee9c-2288-4843-ab71-0720c8527073-kube-api-access-rrtqr\") pod \"cloudkitty-lokistack-querier-58c84b5844-whkj4\" (UID: \"d71cee9c-2288-4843-ab71-0720c8527073\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.811116 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d71cee9c-2288-4843-ab71-0720c8527073-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-58c84b5844-whkj4\" (UID: \"d71cee9c-2288-4843-ab71-0720c8527073\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.811463 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d71cee9c-2288-4843-ab71-0720c8527073-config\") pod \"cloudkitty-lokistack-querier-58c84b5844-whkj4\" (UID: \"d71cee9c-2288-4843-ab71-0720c8527073\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.812237 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d71cee9c-2288-4843-ab71-0720c8527073-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-58c84b5844-whkj4\" (UID: \"d71cee9c-2288-4843-ab71-0720c8527073\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.817013 4781 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.822373 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/d71cee9c-2288-4843-ab71-0720c8527073-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-58c84b5844-whkj4\" (UID: \"d71cee9c-2288-4843-ab71-0720c8527073\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.824458 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/d71cee9c-2288-4843-ab71-0720c8527073-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-58c84b5844-whkj4\" (UID: \"d71cee9c-2288-4843-ab71-0720c8527073\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.827390 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/d71cee9c-2288-4843-ab71-0720c8527073-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-58c84b5844-whkj4\" (UID: \"d71cee9c-2288-4843-ab71-0720c8527073\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.837836 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrtqr\" (UniqueName: \"kubernetes.io/projected/d71cee9c-2288-4843-ab71-0720c8527073-kube-api-access-rrtqr\") pod \"cloudkitty-lokistack-querier-58c84b5844-whkj4\" (UID: \"d71cee9c-2288-4843-ab71-0720c8527073\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.868314 4781 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x"] Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.869800 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.878205 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.878237 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-client-http" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.878375 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.878445 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl"] Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.878558 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway-ca-bundle" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.878731 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.878834 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-http" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.880431 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.884135 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-dockercfg-vrztz" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.884711 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x"] Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.890005 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl"] Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.912291 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9e3acc2-cee4-4bfe-af04-3a64041fc327-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f\" (UID: \"d9e3acc2-cee4-4bfe-af04-3a64041fc327\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.912726 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9e3acc2-cee4-4bfe-af04-3a64041fc327-config\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f\" (UID: \"d9e3acc2-cee4-4bfe-af04-3a64041fc327\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.912844 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/d9e3acc2-cee4-4bfe-af04-3a64041fc327-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f\" (UID: \"d9e3acc2-cee4-4bfe-af04-3a64041fc327\") " 
pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.912990 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml94h\" (UniqueName: \"kubernetes.io/projected/d9e3acc2-cee4-4bfe-af04-3a64041fc327-kube-api-access-ml94h\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f\" (UID: \"d9e3acc2-cee4-4bfe-af04-3a64041fc327\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.913147 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/d9e3acc2-cee4-4bfe-af04-3a64041fc327-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f\" (UID: \"d9e3acc2-cee4-4bfe-af04-3a64041fc327\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.913795 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9e3acc2-cee4-4bfe-af04-3a64041fc327-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f\" (UID: \"d9e3acc2-cee4-4bfe-af04-3a64041fc327\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.914242 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9e3acc2-cee4-4bfe-af04-3a64041fc327-config\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f\" (UID: \"d9e3acc2-cee4-4bfe-af04-3a64041fc327\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.920547 4781 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/d9e3acc2-cee4-4bfe-af04-3a64041fc327-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f\" (UID: \"d9e3acc2-cee4-4bfe-af04-3a64041fc327\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.931004 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/d9e3acc2-cee4-4bfe-af04-3a64041fc327-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f\" (UID: \"d9e3acc2-cee4-4bfe-af04-3a64041fc327\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.931910 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml94h\" (UniqueName: \"kubernetes.io/projected/d9e3acc2-cee4-4bfe-af04-3a64041fc327-kube-api-access-ml94h\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f\" (UID: \"d9e3acc2-cee4-4bfe-af04-3a64041fc327\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:25:15 crc kubenswrapper[4781]: I0227 00:25:15.943600 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.014699 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.014849 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.014885 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/233250c8-3871-43ec-8c1d-47bd1d3133e1-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.014923 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/233250c8-3871-43ec-8c1d-47bd1d3133e1-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " 
pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.015002 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.015083 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.015111 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.015140 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/233250c8-3871-43ec-8c1d-47bd1d3133e1-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.015158 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" 
(UniqueName: \"kubernetes.io/secret/233250c8-3871-43ec-8c1d-47bd1d3133e1-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.015238 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/233250c8-3871-43ec-8c1d-47bd1d3133e1-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.015267 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/233250c8-3871-43ec-8c1d-47bd1d3133e1-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.015287 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.015323 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/233250c8-3871-43ec-8c1d-47bd1d3133e1-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " 
pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.015342 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp2qg\" (UniqueName: \"kubernetes.io/projected/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-kube-api-access-sp2qg\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.015382 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.015403 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/233250c8-3871-43ec-8c1d-47bd1d3133e1-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.015428 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.015446 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glsph\" (UniqueName: \"kubernetes.io/projected/233250c8-3871-43ec-8c1d-47bd1d3133e1-kube-api-access-glsph\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.101743 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.116316 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.116355 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/233250c8-3871-43ec-8c1d-47bd1d3133e1-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.116379 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/233250c8-3871-43ec-8c1d-47bd1d3133e1-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc 
kubenswrapper[4781]: I0227 00:25:16.116408 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.116443 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.116460 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.116972 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/233250c8-3871-43ec-8c1d-47bd1d3133e1-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.116997 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/233250c8-3871-43ec-8c1d-47bd1d3133e1-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: 
\"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.117025 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/233250c8-3871-43ec-8c1d-47bd1d3133e1-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.117043 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/233250c8-3871-43ec-8c1d-47bd1d3133e1-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.117062 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.117084 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/233250c8-3871-43ec-8c1d-47bd1d3133e1-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.117103 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp2qg\" (UniqueName: 
\"kubernetes.io/projected/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-kube-api-access-sp2qg\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.117126 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.117142 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/233250c8-3871-43ec-8c1d-47bd1d3133e1-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.117163 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.117178 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glsph\" (UniqueName: \"kubernetes.io/projected/233250c8-3871-43ec-8c1d-47bd1d3133e1-kube-api-access-glsph\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 
00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.117214 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.117620 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.117892 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.118512 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/233250c8-3871-43ec-8c1d-47bd1d3133e1-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.118647 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.118662 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.118715 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/233250c8-3871-43ec-8c1d-47bd1d3133e1-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.118914 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/233250c8-3871-43ec-8c1d-47bd1d3133e1-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.119002 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/233250c8-3871-43ec-8c1d-47bd1d3133e1-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc 
kubenswrapper[4781]: I0227 00:25:16.119051 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.119122 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/233250c8-3871-43ec-8c1d-47bd1d3133e1-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.120817 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.121015 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.121537 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/233250c8-3871-43ec-8c1d-47bd1d3133e1-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " 
pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.122180 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.122490 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/233250c8-3871-43ec-8c1d-47bd1d3133e1-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.124967 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/233250c8-3871-43ec-8c1d-47bd1d3133e1-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.137230 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glsph\" (UniqueName: \"kubernetes.io/projected/233250c8-3871-43ec-8c1d-47bd1d3133e1-kube-api-access-glsph\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-mj87x\" (UID: \"233250c8-3871-43ec-8c1d-47bd1d3133e1\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.138972 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp2qg\" (UniqueName: 
\"kubernetes.io/projected/877c39ec-0202-4987-b6e7-4fb90c4dc9b5-kube-api-access-sp2qg\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-bxttl\" (UID: \"877c39ec-0202-4987-b6e7-4fb90c4dc9b5\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.203806 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.252616 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.608137 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.610665 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.616290 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-grpc" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.616570 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-http" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.628038 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.672694 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.676702 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.680267 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-http" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.680421 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-grpc" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.687221 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.726575 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2691e066-2f4c-4e7e-bcac-01933bd6cadb-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.726657 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhhw9\" (UniqueName: \"kubernetes.io/projected/2691e066-2f4c-4e7e-bcac-01933bd6cadb-kube-api-access-bhhw9\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.726688 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2691e066-2f4c-4e7e-bcac-01933bd6cadb-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.726719 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/2691e066-2f4c-4e7e-bcac-01933bd6cadb-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.726765 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.726790 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/2691e066-2f4c-4e7e-bcac-01933bd6cadb-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.726942 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/2691e066-2f4c-4e7e-bcac-01933bd6cadb-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.727162 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " 
pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.758838 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.760602 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.763270 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-grpc" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.765490 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-http" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.768718 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829268 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/684ccdab-ae41-466c-bf47-78c3ada41164-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829313 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn45d\" (UniqueName: \"kubernetes.io/projected/42503ae1-b143-45c3-8789-e2d1f72cc335-kube-api-access-dn45d\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829338 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/684ccdab-ae41-466c-bf47-78c3ada41164-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829363 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829394 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/2691e066-2f4c-4e7e-bcac-01933bd6cadb-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829418 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/684ccdab-ae41-466c-bf47-78c3ada41164-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829438 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 
00:25:16.829458 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42503ae1-b143-45c3-8789-e2d1f72cc335-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829485 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/2691e066-2f4c-4e7e-bcac-01933bd6cadb-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829504 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829525 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/684ccdab-ae41-466c-bf47-78c3ada41164-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829554 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/684ccdab-ae41-466c-bf47-78c3ada41164-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " 
pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829585 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp4hx\" (UniqueName: \"kubernetes.io/projected/684ccdab-ae41-466c-bf47-78c3ada41164-kube-api-access-rp4hx\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829621 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829667 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/42503ae1-b143-45c3-8789-e2d1f72cc335-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829690 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2691e066-2f4c-4e7e-bcac-01933bd6cadb-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829738 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/42503ae1-b143-45c3-8789-e2d1f72cc335-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829755 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/42503ae1-b143-45c3-8789-e2d1f72cc335-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829791 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhhw9\" (UniqueName: \"kubernetes.io/projected/2691e066-2f4c-4e7e-bcac-01933bd6cadb-kube-api-access-bhhw9\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829818 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/42503ae1-b143-45c3-8789-e2d1f72cc335-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829836 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2691e066-2f4c-4e7e-bcac-01933bd6cadb-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.829869 
4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/2691e066-2f4c-4e7e-bcac-01933bd6cadb-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.831020 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2691e066-2f4c-4e7e-bcac-01933bd6cadb-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.831123 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2691e066-2f4c-4e7e-bcac-01933bd6cadb-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.831133 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.831233 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc 
kubenswrapper[4781]: I0227 00:25:16.836951 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/2691e066-2f4c-4e7e-bcac-01933bd6cadb-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.837390 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/2691e066-2f4c-4e7e-bcac-01933bd6cadb-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.837840 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/2691e066-2f4c-4e7e-bcac-01933bd6cadb-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.848034 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhhw9\" (UniqueName: \"kubernetes.io/projected/2691e066-2f4c-4e7e-bcac-01933bd6cadb-kube-api-access-bhhw9\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.861869 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 
crc kubenswrapper[4781]: I0227 00:25:16.867320 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"2691e066-2f4c-4e7e-bcac-01933bd6cadb\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.930962 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42503ae1-b143-45c3-8789-e2d1f72cc335-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.931263 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.931897 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42503ae1-b143-45c3-8789-e2d1f72cc335-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.932300 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.932353 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/684ccdab-ae41-466c-bf47-78c3ada41164-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.932396 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/684ccdab-ae41-466c-bf47-78c3ada41164-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.932441 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp4hx\" (UniqueName: \"kubernetes.io/projected/684ccdab-ae41-466c-bf47-78c3ada41164-kube-api-access-rp4hx\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.932492 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/42503ae1-b143-45c3-8789-e2d1f72cc335-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.932529 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42503ae1-b143-45c3-8789-e2d1f72cc335-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 
crc kubenswrapper[4781]: I0227 00:25:16.932561 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/42503ae1-b143-45c3-8789-e2d1f72cc335-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.932650 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/42503ae1-b143-45c3-8789-e2d1f72cc335-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.932804 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/684ccdab-ae41-466c-bf47-78c3ada41164-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.932830 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn45d\" (UniqueName: \"kubernetes.io/projected/42503ae1-b143-45c3-8789-e2d1f72cc335-kube-api-access-dn45d\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.932867 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/684ccdab-ae41-466c-bf47-78c3ada41164-cloudkitty-lokistack-index-gateway-grpc\") pod 
\"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.932961 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/684ccdab-ae41-466c-bf47-78c3ada41164-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.933001 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.933968 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/684ccdab-ae41-466c-bf47-78c3ada41164-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.934017 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/684ccdab-ae41-466c-bf47-78c3ada41164-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.934196 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.937042 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/42503ae1-b143-45c3-8789-e2d1f72cc335-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.939894 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/42503ae1-b143-45c3-8789-e2d1f72cc335-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.943061 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/684ccdab-ae41-466c-bf47-78c3ada41164-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.945959 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.949331 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42503ae1-b143-45c3-8789-e2d1f72cc335-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.950438 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/684ccdab-ae41-466c-bf47-78c3ada41164-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.950892 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/684ccdab-ae41-466c-bf47-78c3ada41164-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.952671 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/42503ae1-b143-45c3-8789-e2d1f72cc335-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.954917 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn45d\" (UniqueName: \"kubernetes.io/projected/42503ae1-b143-45c3-8789-e2d1f72cc335-kube-api-access-dn45d\") 
pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.956881 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp4hx\" (UniqueName: \"kubernetes.io/projected/684ccdab-ae41-466c-bf47-78c3ada41164-kube-api-access-rp4hx\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.967619 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"42503ae1-b143-45c3-8789-e2d1f72cc335\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:16 crc kubenswrapper[4781]: I0227 00:25:16.970440 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"684ccdab-ae41-466c-bf47-78c3ada41164\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:17 crc kubenswrapper[4781]: I0227 00:25:17.006580 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:17 crc kubenswrapper[4781]: I0227 00:25:17.100364 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:17 crc kubenswrapper[4781]: E0227 00:25:17.408497 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 27 00:25:17 crc kubenswrapper[4781]: E0227 00:25:17.409142 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8fltx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:
*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-vdbg9_openstack(c73aa093-0d39-41f9-a0bd-35e621c4cf8c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 00:25:17 crc kubenswrapper[4781]: E0227 00:25:17.410366 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-vdbg9" podUID="c73aa093-0d39-41f9-a0bd-35e621c4cf8c" Feb 27 00:25:17 crc kubenswrapper[4781]: E0227 00:25:17.488407 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 27 00:25:17 crc kubenswrapper[4781]: E0227 00:25:17.488567 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8z4t2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-6drvh_openstack(5411c548-900f-4d1e-816d-8687268b6ebc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 00:25:17 crc kubenswrapper[4781]: E0227 00:25:17.490272 4781 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-6drvh" podUID="5411c548-900f-4d1e-816d-8687268b6ebc" Feb 27 00:25:17 crc kubenswrapper[4781]: E0227 00:25:17.497065 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 27 00:25:17 crc kubenswrapper[4781]: E0227 00:25:17.497252 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8gk5b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-86z46_openstack(12142f3c-5849-4af1-8c9e-c92304d3c375): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 00:25:17 crc kubenswrapper[4781]: E0227 00:25:17.506331 4781 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 27 00:25:17 crc kubenswrapper[4781]: E0227 00:25:17.506481 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fw2z9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesyste
m:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-f9h55_openstack(0f2c76ec-cfab-4f18-b624-722021700885): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 00:25:17 crc kubenswrapper[4781]: E0227 00:25:17.508176 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-86z46" podUID="12142f3c-5849-4af1-8c9e-c92304d3c375" Feb 27 00:25:17 crc kubenswrapper[4781]: E0227 00:25:17.509688 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" podUID="0f2c76ec-cfab-4f18-b624-722021700885" Feb 27 00:25:17 crc kubenswrapper[4781]: E0227 00:25:17.785403 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-86z46" podUID="12142f3c-5849-4af1-8c9e-c92304d3c375" Feb 27 00:25:17 crc kubenswrapper[4781]: E0227 00:25:17.785665 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" 
pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" podUID="0f2c76ec-cfab-4f18-b624-722021700885" Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.112784 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.118808 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.452063 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-vdbg9" Feb 27 00:25:18 crc kubenswrapper[4781]: W0227 00:25:18.455973 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22624edd_e366_4aff_84dd_c3cec89c0591.slice/crio-fc122a09a1ad822388f597d9278445e9c9b6bce2b48664dd2b6b7c6b7437eac4 WatchSource:0}: Error finding container fc122a09a1ad822388f597d9278445e9c9b6bce2b48664dd2b6b7c6b7437eac4: Status 404 returned error can't find the container with id fc122a09a1ad822388f597d9278445e9c9b6bce2b48664dd2b6b7c6b7437eac4 Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.462097 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.471182 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-6drvh" Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.571616 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8z4t2\" (UniqueName: \"kubernetes.io/projected/5411c548-900f-4d1e-816d-8687268b6ebc-kube-api-access-8z4t2\") pod \"5411c548-900f-4d1e-816d-8687268b6ebc\" (UID: \"5411c548-900f-4d1e-816d-8687268b6ebc\") " Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.571702 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5411c548-900f-4d1e-816d-8687268b6ebc-dns-svc\") pod \"5411c548-900f-4d1e-816d-8687268b6ebc\" (UID: \"5411c548-900f-4d1e-816d-8687268b6ebc\") " Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.571756 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5411c548-900f-4d1e-816d-8687268b6ebc-config\") pod \"5411c548-900f-4d1e-816d-8687268b6ebc\" (UID: \"5411c548-900f-4d1e-816d-8687268b6ebc\") " Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.571799 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fltx\" (UniqueName: \"kubernetes.io/projected/c73aa093-0d39-41f9-a0bd-35e621c4cf8c-kube-api-access-8fltx\") pod \"c73aa093-0d39-41f9-a0bd-35e621c4cf8c\" (UID: \"c73aa093-0d39-41f9-a0bd-35e621c4cf8c\") " Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.571832 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c73aa093-0d39-41f9-a0bd-35e621c4cf8c-config\") pod \"c73aa093-0d39-41f9-a0bd-35e621c4cf8c\" (UID: \"c73aa093-0d39-41f9-a0bd-35e621c4cf8c\") " Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.572401 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/5411c548-900f-4d1e-816d-8687268b6ebc-config" (OuterVolumeSpecName: "config") pod "5411c548-900f-4d1e-816d-8687268b6ebc" (UID: "5411c548-900f-4d1e-816d-8687268b6ebc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.572496 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c73aa093-0d39-41f9-a0bd-35e621c4cf8c-config" (OuterVolumeSpecName: "config") pod "c73aa093-0d39-41f9-a0bd-35e621c4cf8c" (UID: "c73aa093-0d39-41f9-a0bd-35e621c4cf8c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.572578 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5411c548-900f-4d1e-816d-8687268b6ebc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5411c548-900f-4d1e-816d-8687268b6ebc" (UID: "5411c548-900f-4d1e-816d-8687268b6ebc"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.573048 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5411c548-900f-4d1e-816d-8687268b6ebc-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.573067 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5411c548-900f-4d1e-816d-8687268b6ebc-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.573077 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c73aa093-0d39-41f9-a0bd-35e621c4cf8c-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.576766 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c73aa093-0d39-41f9-a0bd-35e621c4cf8c-kube-api-access-8fltx" (OuterVolumeSpecName: "kube-api-access-8fltx") pod "c73aa093-0d39-41f9-a0bd-35e621c4cf8c" (UID: "c73aa093-0d39-41f9-a0bd-35e621c4cf8c"). InnerVolumeSpecName "kube-api-access-8fltx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.578923 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5411c548-900f-4d1e-816d-8687268b6ebc-kube-api-access-8z4t2" (OuterVolumeSpecName: "kube-api-access-8z4t2") pod "5411c548-900f-4d1e-816d-8687268b6ebc" (UID: "5411c548-900f-4d1e-816d-8687268b6ebc"). InnerVolumeSpecName "kube-api-access-8z4t2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:25:18 crc kubenswrapper[4781]: W0227 00:25:18.664111 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7ca2a9f_a42e_4d9b_89a7_f2590842f328.slice/crio-231128a0e69346808037ae68c6b271f51f365db9ad7d7761da0f5c52d5d3f07d WatchSource:0}: Error finding container 231128a0e69346808037ae68c6b271f51f365db9ad7d7761da0f5c52d5d3f07d: Status 404 returned error can't find the container with id 231128a0e69346808037ae68c6b271f51f365db9ad7d7761da0f5c52d5d3f07d Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.674617 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.674978 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8z4t2\" (UniqueName: \"kubernetes.io/projected/5411c548-900f-4d1e-816d-8687268b6ebc-kube-api-access-8z4t2\") on node \"crc\" DevicePath \"\"" Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.675025 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fltx\" (UniqueName: \"kubernetes.io/projected/c73aa093-0d39-41f9-a0bd-35e621c4cf8c-kube-api-access-8fltx\") on node \"crc\" DevicePath \"\"" Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.752922 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"06e98c4a-d812-4e42-b95c-d263e49bf5d3","Type":"ContainerStarted","Data":"c5464028f7513fcdae391f484a3a38c5d42febbb578693bd7eafd9d1eac440ff"} Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.754392 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c7ca2a9f-a42e-4d9b-89a7-f2590842f328","Type":"ContainerStarted","Data":"231128a0e69346808037ae68c6b271f51f365db9ad7d7761da0f5c52d5d3f07d"} Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.755483 4781 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d59d3864-af0d-407c-8431-ae2e17e4b46f","Type":"ContainerStarted","Data":"6b8f7a80085810149f097d16464405f2d3e453688708d9148af3e4e13a432507"} Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.756406 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-vdbg9" event={"ID":"c73aa093-0d39-41f9-a0bd-35e621c4cf8c","Type":"ContainerDied","Data":"060eb712a24d9ca196230b121a82b836df7e85841645aaea05dcf490cac6e106"} Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.756479 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-vdbg9" Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.763968 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-6drvh" Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.764019 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-6drvh" event={"ID":"5411c548-900f-4d1e-816d-8687268b6ebc","Type":"ContainerDied","Data":"705ae1ac3f4cc27e243e281ed7ae1b8b6a29f990c8825db10f1d6ca4d40585cf"} Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.772201 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"22624edd-e366-4aff-84dd-c3cec89c0591","Type":"ContainerStarted","Data":"fc122a09a1ad822388f597d9278445e9c9b6bce2b48664dd2b6b7c6b7437eac4"} Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.838071 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.845191 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.851591 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovn-controller-9zkpb"] Feb 27 00:25:18 crc kubenswrapper[4781]: W0227 00:25:18.856088 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91997a3e_9e65_4eab_a0b9_8f9c639a8d05.slice/crio-40455368adae6ae08a72863a733dea4cab1b575394a61b8c7f6b3e11518f1446 WatchSource:0}: Error finding container 40455368adae6ae08a72863a733dea4cab1b575394a61b8c7f6b3e11518f1446: Status 404 returned error can't find the container with id 40455368adae6ae08a72863a733dea4cab1b575394a61b8c7f6b3e11518f1446 Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.859559 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 27 00:25:18 crc kubenswrapper[4781]: W0227 00:25:18.866041 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58009056_4183_4017_bfa1_c14ce28b92ea.slice/crio-493c28f23519740c1b7fbcc8f895aa01b0fc47933b8169023d997be516c2a502 WatchSource:0}: Error finding container 493c28f23519740c1b7fbcc8f895aa01b0fc47933b8169023d997be516c2a502: Status 404 returned error can't find the container with id 493c28f23519740c1b7fbcc8f895aa01b0fc47933b8169023d997be516c2a502 Feb 27 00:25:18 crc kubenswrapper[4781]: I0227 00:25:18.995816 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 27 00:25:18 crc kubenswrapper[4781]: W0227 00:25:18.998984 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd103c67_d035_4de1_aba9_667d1eb67813.slice/crio-3182ebf2704dccde26ab7abd6d192671846f9105b6a3740be2e207d2d08e2243 WatchSource:0}: Error finding container 3182ebf2704dccde26ab7abd6d192671846f9105b6a3740be2e207d2d08e2243: Status 404 returned error can't find the container with id 3182ebf2704dccde26ab7abd6d192671846f9105b6a3740be2e207d2d08e2243 Feb 27 00:25:19 crc 
kubenswrapper[4781]: I0227 00:25:19.130163 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vdbg9"] Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.137270 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vdbg9"] Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.153889 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6drvh"] Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.160373 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6drvh"] Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.193943 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4"] Feb 27 00:25:19 crc kubenswrapper[4781]: W0227 00:25:19.205461 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd71cee9c_2288_4843_ab71_0720c8527073.slice/crio-cda482192b4e511d4ceb1a90ce18a6df940ec845cf4be5912926c4014eb0c77c WatchSource:0}: Error finding container cda482192b4e511d4ceb1a90ce18a6df940ec845cf4be5912926c4014eb0c77c: Status 404 returned error can't find the container with id cda482192b4e511d4ceb1a90ce18a6df940ec845cf4be5912926c4014eb0c77c Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.322419 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5411c548-900f-4d1e-816d-8687268b6ebc" path="/var/lib/kubelet/pods/5411c548-900f-4d1e-816d-8687268b6ebc/volumes" Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.322835 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c73aa093-0d39-41f9-a0bd-35e621c4cf8c" path="/var/lib/kubelet/pods/c73aa093-0d39-41f9-a0bd-35e621c4cf8c/volumes" Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.397783 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cloudkitty-lokistack-ingester-0"] Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.410868 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f"] Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.421526 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf"] Feb 27 00:25:19 crc kubenswrapper[4781]: W0227 00:25:19.437215 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5170e93_09e9_40d2_ac65_b87d44ceb185.slice/crio-688b627a224232d5a2d9839ff3427637e8527398cb09527e50a702a1895864c0 WatchSource:0}: Error finding container 688b627a224232d5a2d9839ff3427637e8527398cb09527e50a702a1895864c0: Status 404 returned error can't find the container with id 688b627a224232d5a2d9839ff3427637e8527398cb09527e50a702a1895864c0 Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.439908 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.482938 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x"] Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.503807 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.511365 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl"] Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.794017 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"684ccdab-ae41-466c-bf47-78c3ada41164","Type":"ContainerStarted","Data":"c00c80bfccef04e1fc565a7c44c1a1deb9ddb678c0b87849935b93fed5452439"} Feb 27 
00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.795535 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1f85c54b-b800-429a-ba2d-fe22056ac907","Type":"ContainerStarted","Data":"2ad75abe5f1e9859dec62d9d7e1f0e4f7552fc881d371d2d01763329d31bdef8"} Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.797823 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" event={"ID":"d71cee9c-2288-4843-ab71-0720c8527073","Type":"ContainerStarted","Data":"cda482192b4e511d4ceb1a90ce18a6df940ec845cf4be5912926c4014eb0c77c"} Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.799074 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" event={"ID":"2691e066-2f4c-4e7e-bcac-01933bd6cadb","Type":"ContainerStarted","Data":"6bafacaf9526831ff083e02e39024ee90f76bc912035296c37f6794c2d351566"} Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.800164 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"58009056-4183-4017-bfa1-c14ce28b92ea","Type":"ContainerStarted","Data":"493c28f23519740c1b7fbcc8f895aa01b0fc47933b8169023d997be516c2a502"} Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.801585 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"42503ae1-b143-45c3-8789-e2d1f72cc335","Type":"ContainerStarted","Data":"f3ca9f27b392ca0e92f0701d7b70193de66a094d153f08887bc1e331065b719f"} Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.802804 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9zkpb" event={"ID":"092921e0-a033-4021-b0f5-9c89de3aa830","Type":"ContainerStarted","Data":"cebfceb5ae1ea44437777c92aec223fc929f53d637b67e35eade18f05d4d3c1c"} Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.804040 4781 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"bd103c67-d035-4de1-aba9-667d1eb67813","Type":"ContainerStarted","Data":"3182ebf2704dccde26ab7abd6d192671846f9105b6a3740be2e207d2d08e2243"} Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.805187 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" event={"ID":"877c39ec-0202-4987-b6e7-4fb90c4dc9b5","Type":"ContainerStarted","Data":"33f26099ff4150504b0c6d4ea4dfaf4293bf0feaa48dbc817e863e2d58ba5f60"} Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.806818 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" event={"ID":"233250c8-3871-43ec-8c1d-47bd1d3133e1","Type":"ContainerStarted","Data":"7b2d54735183971b72c72a7da74b00000e7d041b9f935da739b6b6ec312eac0c"} Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.808107 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"91997a3e-9e65-4eab-a0b9-8f9c639a8d05","Type":"ContainerStarted","Data":"40455368adae6ae08a72863a733dea4cab1b575394a61b8c7f6b3e11518f1446"} Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.811407 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"919ba171-1971-416c-99c1-5dfcacc10a28","Type":"ContainerStarted","Data":"96bdfdf43af7fb258b62a9d9f821ba1ad93ea827a8cff8b6d290f5066fe5b0b7"} Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.814332 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" event={"ID":"a5170e93-09e9-40d2-ac65-b87d44ceb185","Type":"ContainerStarted","Data":"688b627a224232d5a2d9839ff3427637e8527398cb09527e50a702a1895864c0"} Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.816785 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" event={"ID":"d9e3acc2-cee4-4bfe-af04-3a64041fc327","Type":"ContainerStarted","Data":"86208fb43402d857d08f4221e080cb603cac9014d9ef8af1aacc6e1db6fc5d11"} Feb 27 00:25:19 crc kubenswrapper[4781]: I0227 00:25:19.910844 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-hcb9s"] Feb 27 00:25:20 crc kubenswrapper[4781]: I0227 00:25:20.234853 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 27 00:25:20 crc kubenswrapper[4781]: I0227 00:25:20.826242 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c7ca2a9f-a42e-4d9b-89a7-f2590842f328","Type":"ContainerStarted","Data":"592b25e10dba92f06ec6db612c25fdc12d9afc456496a972e547225b9ac93f91"} Feb 27 00:25:20 crc kubenswrapper[4781]: I0227 00:25:20.828491 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hcb9s" event={"ID":"9c2c498e-52b1-4ee2-bcf8-3599ee89513c","Type":"ContainerStarted","Data":"9140f071fe8e777af99e3a3ff97717072ad39af1f26192169806802a3ca79168"} Feb 27 00:25:21 crc kubenswrapper[4781]: I0227 00:25:21.837752 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7d499c77-ccba-41d1-9efb-8424fc7e8d0e","Type":"ContainerStarted","Data":"d0ecf3d712f99c7845d7b07b68ef8e7492fbd51a59105297fe43602417b3296c"} Feb 27 00:25:33 crc kubenswrapper[4781]: E0227 00:25:33.909129 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981" Feb 27 00:25:33 crc kubenswrapper[4781]: E0227 00:25:33.909833 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:loki-compactor,Image:registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981,Command:[],Args:[-target=compactor -config.file=/etc/loki/config/config.yaml -runtime-config.file=/etc/loki/config/runtime-config.yaml -config.expand-env=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:3100,Protocol:TCP,HostIP:,},ContainerPort{Name:grpclb,HostPort:0,ContainerPort:9095,Protocol:TCP,HostIP:,},ContainerPort{Name:healthchecks,HostPort:0,ContainerPort:3101,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:AWS_ACCESS_KEY_ID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_id,Optional:nil,},},},EnvVar{Name:AWS_ACCESS_KEY_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_secret,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/loki/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:storage,ReadOnly:false,MountPath:/tmp/loki,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-compactor-http,ReadOnly:false,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-loki-s3,ReadOnly:false,MountPath:/etc/storage/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-compactor-grpc,ReadOnly:false,MountPath:/var/run/tls/grpc/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveRea
dOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:false,MountPath:/var/run/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dn45d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-compactor-0_openstack(42503ae1-b143-45c3-8789-e2d1f72cc335): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 00:25:33 crc kubenswrapper[4781]: E0227 00:25:33.911057 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-compactor\" 
with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/cloudkitty-lokistack-compactor-0" podUID="42503ae1-b143-45c3-8789-e2d1f72cc335" Feb 27 00:25:34 crc kubenswrapper[4781]: E0227 00:25:34.088674 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981" Feb 27 00:25:34 crc kubenswrapper[4781]: E0227 00:25:34.088911 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:loki-query-frontend,Image:registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981,Command:[],Args:[-target=query-frontend -config.file=/etc/loki/config/config.yaml -runtime-config.file=/etc/loki/config/runtime-config.yaml 
-config.expand-env=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:3100,Protocol:TCP,HostIP:,},ContainerPort{Name:grpclb,HostPort:0,ContainerPort:9095,Protocol:TCP,HostIP:,},ContainerPort{Name:healthchecks,HostPort:0,ContainerPort:3101,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/loki/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-query-frontend-http,ReadOnly:false,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-query-frontend-grpc,ReadOnly:false,MountPath:/var/run/tls/grpc/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:false,MountPath:/var/run/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ml94h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f_openstack(d9e3acc2-cee4-4bfe-af04-3a64041fc327): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 00:25:34 crc kubenswrapper[4781]: E0227 00:25:34.090075 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-query-frontend\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" podUID="d9e3acc2-cee4-4bfe-af04-3a64041fc327" Feb 27 00:25:34 crc kubenswrapper[4781]: E0227 00:25:34.280084 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-compactor\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981\\\"\"" pod="openstack/cloudkitty-lokistack-compactor-0" podUID="42503ae1-b143-45c3-8789-e2d1f72cc335" Feb 27 00:25:34 crc kubenswrapper[4781]: E0227 
00:25:34.280164 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-query-frontend\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981\\\"\"" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" podUID="d9e3acc2-cee4-4bfe-af04-3a64041fc327" Feb 27 00:25:34 crc kubenswrapper[4781]: E0227 00:25:34.671155 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:41eda20b890c200ee7fce0b56b5d168445cd9a6486d560f39ce73d0704e03934" Feb 27 00:25:34 crc kubenswrapper[4781]: E0227 00:25:34.671351 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:gateway,Image:registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:41eda20b890c200ee7fce0b56b5d168445cd9a6486d560f39ce73d0704e03934,Command:[],Args:[--debug.name=lokistack-gateway --web.listen=0.0.0.0:8080 --web.internal.listen=0.0.0.0:8081 --web.healthchecks.url=https://localhost:8080 --log.level=warn --logs.read.endpoint=https://cloudkitty-lokistack-query-frontend-http.openstack.svc.cluster.local:3100 --logs.tail.endpoint=https://cloudkitty-lokistack-query-frontend-http.openstack.svc.cluster.local:3100 --logs.write.endpoint=https://cloudkitty-lokistack-distributor-http.openstack.svc.cluster.local:3100 --logs.write-timeout=4m0s --rbac.config=/etc/lokistack-gateway/rbac.yaml --tenants.config=/etc/lokistack-gateway/tenants.yaml --server.read-timeout=48s --server.write-timeout=6m0s --tls.min-version=VersionTLS12 --tls.server.cert-file=/var/run/tls/http/server/tls.crt --tls.server.key-file=/var/run/tls/http/server/tls.key --tls.healthchecks.server-ca-file=/var/run/ca/server/service-ca.crt 
--tls.healthchecks.server-name=cloudkitty-lokistack-gateway-http.openstack.svc.cluster.local --tls.internal.server.cert-file=/var/run/tls/http/server/tls.crt --tls.internal.server.key-file=/var/run/tls/http/server/tls.key --tls.min-version=VersionTLS12 --tls.cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --logs.tls.ca-file=/var/run/ca/upstream/service-ca.crt --logs.tls.cert-file=/var/run/tls/http/upstream/tls.crt --logs.tls.key-file=/var/run/tls/http/upstream/tls.key --tls.client-auth-type=RequestClientCert],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},ContainerPort{Name:public,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rbac,ReadOnly:true,MountPath:/etc/lokistack-gateway/rbac.yaml,SubPath:rbac.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tenants,ReadOnly:true,MountPath:/etc/lokistack-gateway/tenants.yaml,SubPath:tenants.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:lokistack-gateway,ReadOnly:true,MountPath:/etc/lokistack-gateway/lokistack-gateway.rego,SubPath:lokistack-gateway.rego,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tls-secret,ReadOnly:true,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-gateway-client-http,ReadOnly:true,MountPath:/var/run/tls/http/upstream,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudki
tty-lokistack-ca-bundle,ReadOnly:true,MountPath:/var/run/ca/upstream,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-gateway-ca-bundle,ReadOnly:true,MountPath:/var/run/ca/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-ca-bundle,ReadOnly:false,MountPath:/var/run/tenants-ca/cloudkitty,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sp2qg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/live,Port:{0 8081 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 8081 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:12,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-gateway-7f8685b49f-bxttl_openstack(877c39ec-0202-4987-b6e7-4fb90c4dc9b5): ErrImagePull: rpc error: code = 
Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 00:25:34 crc kubenswrapper[4781]: E0227 00:25:34.672682 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"gateway\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" podUID="877c39ec-0202-4987-b6e7-4fb90c4dc9b5" Feb 27 00:25:35 crc kubenswrapper[4781]: E0227 00:25:35.011699 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified" Feb 27 00:25:35 crc kubenswrapper[4781]: E0227 00:25:35.011904 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovsdbserver-nb,Image:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n6h65ch588h587h5c8h5b8h576hcch654h5dchb8h68h6bh65fh569h555hddh547h5hd8h5c4h68fh68ch687h99h5f9h5f9h587h57dhd4h5dbhd8q,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-nb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Na
me:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vdtk4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof 
ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-nb-0_openstack(bd103c67-d035-4de1-aba9-667d1eb67813): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 00:25:35 crc kubenswrapper[4781]: E0227 00:25:35.232811 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified" Feb 27 00:25:35 crc kubenswrapper[4781]: E0227 00:25:35.233089 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovn-controller,Image:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,Command:[ovn-controller --pidfile unix:/run/openvswitch/db.sock 
--certificate=/etc/pki/tls/certs/ovndb.crt --private-key=/etc/pki/tls/private/ovndb.key --ca-cert=/etc/pki/tls/certs/ovndbca.crt],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nc4h64fh576h5bdh58ch667h546h664h5fh579h58bhbdh84hd7hb8h554h5c9h74h67ch8ch5cfh566h5b9hd8h89h58bh654h96h5c8h5f8h5d5h59fq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run-ovn,ReadOnly:false,MountPath:/var/run/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log-ovn,ReadOnly:false,MountPath:/var/log/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rj29s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/
local/bin/container-scripts/ovn_controller_liveness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_readiness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/share/ovn/scripts/ovn-ctl stop_controller],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-9zkpb_openstack(092921e0-a033-4021-b0f5-9c89de3aa830): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 00:25:35 crc kubenswrapper[4781]: E0227 00:25:35.235169 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-9zkpb" podUID="092921e0-a033-4021-b0f5-9c89de3aa830" Feb 27 00:25:35 crc kubenswrapper[4781]: E0227 00:25:35.245739 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = 
Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:41eda20b890c200ee7fce0b56b5d168445cd9a6486d560f39ce73d0704e03934" Feb 27 00:25:35 crc kubenswrapper[4781]: E0227 00:25:35.245931 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:gateway,Image:registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:41eda20b890c200ee7fce0b56b5d168445cd9a6486d560f39ce73d0704e03934,Command:[],Args:[--debug.name=lokistack-gateway --web.listen=0.0.0.0:8080 --web.internal.listen=0.0.0.0:8081 --web.healthchecks.url=https://localhost:8080 --log.level=warn --logs.read.endpoint=https://cloudkitty-lokistack-query-frontend-http.openstack.svc.cluster.local:3100 --logs.tail.endpoint=https://cloudkitty-lokistack-query-frontend-http.openstack.svc.cluster.local:3100 --logs.write.endpoint=https://cloudkitty-lokistack-distributor-http.openstack.svc.cluster.local:3100 --logs.write-timeout=4m0s --rbac.config=/etc/lokistack-gateway/rbac.yaml --tenants.config=/etc/lokistack-gateway/tenants.yaml --server.read-timeout=48s --server.write-timeout=6m0s --tls.min-version=VersionTLS12 --tls.server.cert-file=/var/run/tls/http/server/tls.crt --tls.server.key-file=/var/run/tls/http/server/tls.key --tls.healthchecks.server-ca-file=/var/run/ca/server/service-ca.crt --tls.healthchecks.server-name=cloudkitty-lokistack-gateway-http.openstack.svc.cluster.local --tls.internal.server.cert-file=/var/run/tls/http/server/tls.crt --tls.internal.server.key-file=/var/run/tls/http/server/tls.key --tls.min-version=VersionTLS12 --tls.cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 
--logs.tls.ca-file=/var/run/ca/upstream/service-ca.crt --logs.tls.cert-file=/var/run/tls/http/upstream/tls.crt --logs.tls.key-file=/var/run/tls/http/upstream/tls.key --tls.client-auth-type=RequestClientCert],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},ContainerPort{Name:public,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rbac,ReadOnly:true,MountPath:/etc/lokistack-gateway/rbac.yaml,SubPath:rbac.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tenants,ReadOnly:true,MountPath:/etc/lokistack-gateway/tenants.yaml,SubPath:tenants.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:lokistack-gateway,ReadOnly:true,MountPath:/etc/lokistack-gateway/lokistack-gateway.rego,SubPath:lokistack-gateway.rego,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tls-secret,ReadOnly:true,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-gateway-client-http,ReadOnly:true,MountPath:/var/run/tls/http/upstream,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:true,MountPath:/var/run/ca/upstream,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-gateway-ca-bundle,ReadOnly:true,MountPath:/var/run/ca/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-ca-bundle,ReadOnly:false,MountPath:/var/run/tenants-ca/cloudkitty,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-glsph,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagat
ion:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/live,Port:{0 8081 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 8081 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:12,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-gateway-7f8685b49f-mj87x_openstack(233250c8-3871-43ec-8c1d-47bd1d3133e1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 00:25:35 crc kubenswrapper[4781]: E0227 00:25:35.247138 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"gateway\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" podUID="233250c8-3871-43ec-8c1d-47bd1d3133e1" Feb 27 00:25:35 crc kubenswrapper[4781]: E0227 00:25:35.292798 4781 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified\\\"\"" pod="openstack/ovn-controller-9zkpb" podUID="092921e0-a033-4021-b0f5-9c89de3aa830" Feb 27 00:25:35 crc kubenswrapper[4781]: E0227 00:25:35.292972 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"gateway\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:41eda20b890c200ee7fce0b56b5d168445cd9a6486d560f39ce73d0704e03934\\\"\"" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" podUID="877c39ec-0202-4987-b6e7-4fb90c4dc9b5" Feb 27 00:25:35 crc kubenswrapper[4781]: E0227 00:25:35.292983 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"gateway\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:41eda20b890c200ee7fce0b56b5d168445cd9a6486d560f39ce73d0704e03934\\\"\"" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" podUID="233250c8-3871-43ec-8c1d-47bd1d3133e1" Feb 27 00:25:36 crc kubenswrapper[4781]: E0227 00:25:36.340541 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Feb 27 00:25:36 crc kubenswrapper[4781]: E0227 00:25:36.340885 4781 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Feb 27 00:25:36 crc kubenswrapper[4781]: E0227 00:25:36.341006 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hr9c5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
kube-state-metrics-0_openstack(91997a3e-9e65-4eab-a0b9-8f9c639a8d05): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 00:25:36 crc kubenswrapper[4781]: E0227 00:25:36.342152 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="91997a3e-9e65-4eab-a0b9-8f9c639a8d05" Feb 27 00:25:37 crc kubenswrapper[4781]: E0227 00:25:37.315863 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="91997a3e-9e65-4eab-a0b9-8f9c639a8d05" Feb 27 00:25:38 crc kubenswrapper[4781]: E0227 00:25:38.041003 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovsdbserver-nb-0" podUID="bd103c67-d035-4de1-aba9-667d1eb67813" Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.323939 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d59d3864-af0d-407c-8431-ae2e17e4b46f","Type":"ContainerStarted","Data":"5bbf5b24d4d9008f5b75e4c0fa41c61058490ccdd91cacafc1c8d754b500b1d9"} Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.325888 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"22624edd-e366-4aff-84dd-c3cec89c0591","Type":"ContainerStarted","Data":"560bdbc40641b47d07958de312ae75dadef9033fa90a1674b4544f472b537552"} Feb 27 00:25:38 crc kubenswrapper[4781]: 
I0227 00:25:38.328307 4781 generic.go:334] "Generic (PLEG): container finished" podID="12142f3c-5849-4af1-8c9e-c92304d3c375" containerID="27561c2bc3f7ceff4782a59063d870210455fb0d34834350e478532c8d854d43" exitCode=0 Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.328340 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-86z46" event={"ID":"12142f3c-5849-4af1-8c9e-c92304d3c375","Type":"ContainerDied","Data":"27561c2bc3f7ceff4782a59063d870210455fb0d34834350e478532c8d854d43"} Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.329704 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" event={"ID":"a5170e93-09e9-40d2-ac65-b87d44ceb185","Type":"ContainerStarted","Data":"b77bd5c6b95e65b09b6b96f9bf95bd4d4fbe2391eda0dbf3e303eee73cece604"} Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.329824 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.334367 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hcb9s" event={"ID":"9c2c498e-52b1-4ee2-bcf8-3599ee89513c","Type":"ContainerStarted","Data":"d640ffe0e868584caf90344e7b76246aa7b432b903e1a81afcd65d5f36b832bf"} Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.336307 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7d499c77-ccba-41d1-9efb-8424fc7e8d0e","Type":"ContainerStarted","Data":"237ed7dac6f99d58c104a037ad50883c6586975e333239c944c324f9de747ddb"} Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.338981 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" 
event={"ID":"d71cee9c-2288-4843-ab71-0720c8527073","Type":"ContainerStarted","Data":"ec8b42000edf747a02b6a4cf8c4141b9c4d8a4ac9fcd17ee9c7836c6dfe6b2c1"} Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.339142 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.340672 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"06e98c4a-d812-4e42-b95c-d263e49bf5d3","Type":"ContainerStarted","Data":"754acbf64e09afaf10eb0f4a70a3d85f343afd74ee94f34a891dcbfb091a16b1"} Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.340803 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.343347 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" event={"ID":"2691e066-2f4c-4e7e-bcac-01933bd6cadb","Type":"ContainerStarted","Data":"a72d3ac6b8270c1de5ecd0dbf7eed61277df81168d435784e3cced07cdd620d9"} Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.343804 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.357391 4781 generic.go:334] "Generic (PLEG): container finished" podID="0f2c76ec-cfab-4f18-b624-722021700885" containerID="2b13d210b4d0aa474faa06d53125b956502bdd360564f39dba668453c12f7cac" exitCode=0 Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.357660 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" event={"ID":"0f2c76ec-cfab-4f18-b624-722021700885","Type":"ContainerDied","Data":"2b13d210b4d0aa474faa06d53125b956502bdd360564f39dba668453c12f7cac"} Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.360478 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"684ccdab-ae41-466c-bf47-78c3ada41164","Type":"ContainerStarted","Data":"7b9134bf688881cccad49f93b644492fdb6b4260e2b8b5acedfa0c1b7c24253e"} Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.361300 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.362755 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"bd103c67-d035-4de1-aba9-667d1eb67813","Type":"ContainerStarted","Data":"6e0f680e79bb8afab38cf17b90ba7154adcea3356785931d4b932c63706aaed5"} Feb 27 00:25:38 crc kubenswrapper[4781]: E0227 00:25:38.364439 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-nb-0" podUID="bd103c67-d035-4de1-aba9-667d1eb67813" Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.381141 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" podStartSLOduration=6.938862835 podStartE2EDuration="23.381122412s" podCreationTimestamp="2026-02-27 00:25:15 +0000 UTC" firstStartedPulling="2026-02-27 00:25:19.216268503 +0000 UTC m=+1188.473808067" lastFinishedPulling="2026-02-27 00:25:35.65852808 +0000 UTC m=+1204.916067644" observedRunningTime="2026-02-27 00:25:38.37723522 +0000 UTC m=+1207.634774804" watchObservedRunningTime="2026-02-27 00:25:38.381122412 +0000 UTC m=+1207.638661976" Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.440773 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" podStartSLOduration=7.087853876 podStartE2EDuration="23.440755987s" 
podCreationTimestamp="2026-02-27 00:25:15 +0000 UTC" firstStartedPulling="2026-02-27 00:25:19.445701094 +0000 UTC m=+1188.703240648" lastFinishedPulling="2026-02-27 00:25:35.798603205 +0000 UTC m=+1205.056142759" observedRunningTime="2026-02-27 00:25:38.418983626 +0000 UTC m=+1207.676523180" watchObservedRunningTime="2026-02-27 00:25:38.440755987 +0000 UTC m=+1207.698295541" Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.464997 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=19.418143513 podStartE2EDuration="36.464977623s" podCreationTimestamp="2026-02-27 00:25:02 +0000 UTC" firstStartedPulling="2026-02-27 00:25:18.186623605 +0000 UTC m=+1187.444163159" lastFinishedPulling="2026-02-27 00:25:35.233457715 +0000 UTC m=+1204.490997269" observedRunningTime="2026-02-27 00:25:38.4389811 +0000 UTC m=+1207.696520654" watchObservedRunningTime="2026-02-27 00:25:38.464977623 +0000 UTC m=+1207.722517177" Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.514241 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-ingester-0" podStartSLOduration=6.919902829 podStartE2EDuration="23.514215965s" podCreationTimestamp="2026-02-27 00:25:15 +0000 UTC" firstStartedPulling="2026-02-27 00:25:19.442141521 +0000 UTC m=+1188.699681075" lastFinishedPulling="2026-02-27 00:25:36.036454657 +0000 UTC m=+1205.293994211" observedRunningTime="2026-02-27 00:25:38.510366934 +0000 UTC m=+1207.767906528" watchObservedRunningTime="2026-02-27 00:25:38.514215965 +0000 UTC m=+1207.771755519" Feb 27 00:25:38 crc kubenswrapper[4781]: I0227 00:25:38.544405 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-index-gateway-0" podStartSLOduration=7.63373501 podStartE2EDuration="23.544186221s" podCreationTimestamp="2026-02-27 00:25:15 +0000 UTC" firstStartedPulling="2026-02-27 00:25:19.446663929 +0000 UTC 
m=+1188.704203483" lastFinishedPulling="2026-02-27 00:25:35.35711514 +0000 UTC m=+1204.614654694" observedRunningTime="2026-02-27 00:25:38.542146688 +0000 UTC m=+1207.799686252" watchObservedRunningTime="2026-02-27 00:25:38.544186221 +0000 UTC m=+1207.801725775" Feb 27 00:25:39 crc kubenswrapper[4781]: I0227 00:25:39.375902 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-86z46" event={"ID":"12142f3c-5849-4af1-8c9e-c92304d3c375","Type":"ContainerStarted","Data":"60dc5411409f46143d07ac6ec21c1dbecfd87e552b39e9e423583cb105c9c9c0"} Feb 27 00:25:39 crc kubenswrapper[4781]: I0227 00:25:39.376650 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-86z46" Feb 27 00:25:39 crc kubenswrapper[4781]: I0227 00:25:39.378493 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" event={"ID":"0f2c76ec-cfab-4f18-b624-722021700885","Type":"ContainerStarted","Data":"b86f56bb626afb20727e0941782a4964e5b9d6b09cdb0c43236b0144e5335336"} Feb 27 00:25:39 crc kubenswrapper[4781]: I0227 00:25:39.378843 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" Feb 27 00:25:39 crc kubenswrapper[4781]: I0227 00:25:39.381071 4781 generic.go:334] "Generic (PLEG): container finished" podID="9c2c498e-52b1-4ee2-bcf8-3599ee89513c" containerID="d640ffe0e868584caf90344e7b76246aa7b432b903e1a81afcd65d5f36b832bf" exitCode=0 Feb 27 00:25:39 crc kubenswrapper[4781]: I0227 00:25:39.381140 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hcb9s" event={"ID":"9c2c498e-52b1-4ee2-bcf8-3599ee89513c","Type":"ContainerDied","Data":"d640ffe0e868584caf90344e7b76246aa7b432b903e1a81afcd65d5f36b832bf"} Feb 27 00:25:39 crc kubenswrapper[4781]: I0227 00:25:39.385448 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"7d499c77-ccba-41d1-9efb-8424fc7e8d0e","Type":"ContainerStarted","Data":"f666762f8af0c431e8b4e9973700280f9534a62e3f6cf2c22edee444abe0c23b"} Feb 27 00:25:39 crc kubenswrapper[4781]: E0227 00:25:39.388834 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-nb-0" podUID="bd103c67-d035-4de1-aba9-667d1eb67813" Feb 27 00:25:39 crc kubenswrapper[4781]: I0227 00:25:39.417660 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-86z46" podStartSLOduration=4.207561082 podStartE2EDuration="42.417608801s" podCreationTimestamp="2026-02-27 00:24:57 +0000 UTC" firstStartedPulling="2026-02-27 00:24:58.735313798 +0000 UTC m=+1167.992853352" lastFinishedPulling="2026-02-27 00:25:36.945361517 +0000 UTC m=+1206.202901071" observedRunningTime="2026-02-27 00:25:39.4061478 +0000 UTC m=+1208.663687344" watchObservedRunningTime="2026-02-27 00:25:39.417608801 +0000 UTC m=+1208.675148365" Feb 27 00:25:39 crc kubenswrapper[4781]: I0227 00:25:39.432459 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=14.111983561 podStartE2EDuration="28.43243731s" podCreationTimestamp="2026-02-27 00:25:11 +0000 UTC" firstStartedPulling="2026-02-27 00:25:21.71607066 +0000 UTC m=+1190.973610214" lastFinishedPulling="2026-02-27 00:25:36.036524409 +0000 UTC m=+1205.294063963" observedRunningTime="2026-02-27 00:25:39.427379797 +0000 UTC m=+1208.684919351" watchObservedRunningTime="2026-02-27 00:25:39.43243731 +0000 UTC m=+1208.689976864" Feb 27 00:25:39 crc kubenswrapper[4781]: I0227 00:25:39.503774 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" podStartSLOduration=4.287085078 
podStartE2EDuration="41.503759681s" podCreationTimestamp="2026-02-27 00:24:58 +0000 UTC" firstStartedPulling="2026-02-27 00:24:59.112072484 +0000 UTC m=+1168.369612038" lastFinishedPulling="2026-02-27 00:25:36.328747087 +0000 UTC m=+1205.586286641" observedRunningTime="2026-02-27 00:25:39.50105867 +0000 UTC m=+1208.758598224" watchObservedRunningTime="2026-02-27 00:25:39.503759681 +0000 UTC m=+1208.761299235" Feb 27 00:25:40 crc kubenswrapper[4781]: I0227 00:25:40.174330 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:40 crc kubenswrapper[4781]: I0227 00:25:40.394105 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1f85c54b-b800-429a-ba2d-fe22056ac907","Type":"ContainerStarted","Data":"a458867b742ce8b5b3fdd2c97ebf1845a6845fd00e046dd893821ec44de7237b"} Feb 27 00:25:40 crc kubenswrapper[4781]: I0227 00:25:40.396020 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"58009056-4183-4017-bfa1-c14ce28b92ea","Type":"ContainerStarted","Data":"f357b9c4effa13cc010fc9a965fa4ab73f4546260126cc8790d5d9d2d0f6d40a"} Feb 27 00:25:40 crc kubenswrapper[4781]: I0227 00:25:40.399786 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hcb9s" event={"ID":"9c2c498e-52b1-4ee2-bcf8-3599ee89513c","Type":"ContainerStarted","Data":"32c722853a219e907aa6805e822320dfcad64f8bcea351eff620f51e0edb3944"} Feb 27 00:25:40 crc kubenswrapper[4781]: I0227 00:25:40.399832 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hcb9s" event={"ID":"9c2c498e-52b1-4ee2-bcf8-3599ee89513c","Type":"ContainerStarted","Data":"5b960311cd3cb59db0abef0b4fb7b64b110d4b25173600bae19c5bad1cf9cc14"} Feb 27 00:25:40 crc kubenswrapper[4781]: I0227 00:25:40.400044 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:40 crc kubenswrapper[4781]: I0227 00:25:40.400076 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:25:40 crc kubenswrapper[4781]: I0227 00:25:40.448098 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-hcb9s" podStartSLOduration=17.607914563 podStartE2EDuration="33.448081391s" podCreationTimestamp="2026-02-27 00:25:07 +0000 UTC" firstStartedPulling="2026-02-27 00:25:20.015815443 +0000 UTC m=+1189.273354997" lastFinishedPulling="2026-02-27 00:25:35.855982271 +0000 UTC m=+1205.113521825" observedRunningTime="2026-02-27 00:25:40.442501055 +0000 UTC m=+1209.700040619" watchObservedRunningTime="2026-02-27 00:25:40.448081391 +0000 UTC m=+1209.705620945" Feb 27 00:25:42 crc kubenswrapper[4781]: I0227 00:25:42.413062 4781 generic.go:334] "Generic (PLEG): container finished" podID="d59d3864-af0d-407c-8431-ae2e17e4b46f" containerID="5bbf5b24d4d9008f5b75e4c0fa41c61058490ccdd91cacafc1c8d754b500b1d9" exitCode=0 Feb 27 00:25:42 crc kubenswrapper[4781]: I0227 00:25:42.413130 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d59d3864-af0d-407c-8431-ae2e17e4b46f","Type":"ContainerDied","Data":"5bbf5b24d4d9008f5b75e4c0fa41c61058490ccdd91cacafc1c8d754b500b1d9"} Feb 27 00:25:42 crc kubenswrapper[4781]: I0227 00:25:42.415301 4781 generic.go:334] "Generic (PLEG): container finished" podID="22624edd-e366-4aff-84dd-c3cec89c0591" containerID="560bdbc40641b47d07958de312ae75dadef9033fa90a1674b4544f472b537552" exitCode=0 Feb 27 00:25:42 crc kubenswrapper[4781]: I0227 00:25:42.415328 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"22624edd-e366-4aff-84dd-c3cec89c0591","Type":"ContainerDied","Data":"560bdbc40641b47d07958de312ae75dadef9033fa90a1674b4544f472b537552"} Feb 27 00:25:42 crc 
kubenswrapper[4781]: I0227 00:25:42.510678 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.174349 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.211384 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.425374 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"22624edd-e366-4aff-84dd-c3cec89c0591","Type":"ContainerStarted","Data":"833f840d5f18e25dc205e5a7d1f612bc756ef91d8c591872057d4535f5197ff3"} Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.427875 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d59d3864-af0d-407c-8431-ae2e17e4b46f","Type":"ContainerStarted","Data":"ba009ef21836006b9ce41f8f3fecd84b090a6860eb2edc515cf9e9ff611d75de"} Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.451911 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=26.250317561 podStartE2EDuration="43.451891803s" podCreationTimestamp="2026-02-27 00:25:00 +0000 UTC" firstStartedPulling="2026-02-27 00:25:18.457754399 +0000 UTC m=+1187.715293953" lastFinishedPulling="2026-02-27 00:25:35.659328651 +0000 UTC m=+1204.916868195" observedRunningTime="2026-02-27 00:25:43.443211185 +0000 UTC m=+1212.700750779" watchObservedRunningTime="2026-02-27 00:25:43.451891803 +0000 UTC m=+1212.709431367" Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.471170 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=27.298542248 podStartE2EDuration="44.471153149s" podCreationTimestamp="2026-02-27 
00:24:59 +0000 UTC" firstStartedPulling="2026-02-27 00:25:18.184688654 +0000 UTC m=+1187.442228208" lastFinishedPulling="2026-02-27 00:25:35.357299555 +0000 UTC m=+1204.614839109" observedRunningTime="2026-02-27 00:25:43.465293785 +0000 UTC m=+1212.722833369" watchObservedRunningTime="2026-02-27 00:25:43.471153149 +0000 UTC m=+1212.728692713" Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.485527 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.787006 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-86z46"] Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.787204 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-86z46" podUID="12142f3c-5849-4af1-8c9e-c92304d3c375" containerName="dnsmasq-dns" containerID="cri-o://60dc5411409f46143d07ac6ec21c1dbecfd87e552b39e9e423583cb105c9c9c0" gracePeriod=10 Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.816056 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-xl8n7"] Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.817429 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.821971 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.834929 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-xl8n7"] Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.907064 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c3386c0-cab4-47ef-b395-af90773f2796-config\") pod \"dnsmasq-dns-7f896c8c65-xl8n7\" (UID: \"9c3386c0-cab4-47ef-b395-af90773f2796\") " pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.907170 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c3386c0-cab4-47ef-b395-af90773f2796-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-xl8n7\" (UID: \"9c3386c0-cab4-47ef-b395-af90773f2796\") " pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.907354 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c3386c0-cab4-47ef-b395-af90773f2796-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-xl8n7\" (UID: \"9c3386c0-cab4-47ef-b395-af90773f2796\") " pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.907559 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxvhb\" (UniqueName: \"kubernetes.io/projected/9c3386c0-cab4-47ef-b395-af90773f2796-kube-api-access-rxvhb\") pod \"dnsmasq-dns-7f896c8c65-xl8n7\" (UID: \"9c3386c0-cab4-47ef-b395-af90773f2796\") " 
pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.913959 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-hx85z"] Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.914993 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.924238 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 27 00:25:43 crc kubenswrapper[4781]: I0227 00:25:43.938246 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-hx85z"] Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.011086 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c3386c0-cab4-47ef-b395-af90773f2796-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-xl8n7\" (UID: \"9c3386c0-cab4-47ef-b395-af90773f2796\") " pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.011178 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/cf463d95-25dd-4b99-afb0-dac99157c5fa-ovn-rundir\") pod \"ovn-controller-metrics-hx85z\" (UID: \"cf463d95-25dd-4b99-afb0-dac99157c5fa\") " pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.011238 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c3386c0-cab4-47ef-b395-af90773f2796-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-xl8n7\" (UID: \"9c3386c0-cab4-47ef-b395-af90773f2796\") " pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.011295 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf463d95-25dd-4b99-afb0-dac99157c5fa-combined-ca-bundle\") pod \"ovn-controller-metrics-hx85z\" (UID: \"cf463d95-25dd-4b99-afb0-dac99157c5fa\") " pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.012031 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c3386c0-cab4-47ef-b395-af90773f2796-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-xl8n7\" (UID: \"9c3386c0-cab4-47ef-b395-af90773f2796\") " pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.012170 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c3386c0-cab4-47ef-b395-af90773f2796-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-xl8n7\" (UID: \"9c3386c0-cab4-47ef-b395-af90773f2796\") " pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.011325 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/cf463d95-25dd-4b99-afb0-dac99157c5fa-ovs-rundir\") pod \"ovn-controller-metrics-hx85z\" (UID: \"cf463d95-25dd-4b99-afb0-dac99157c5fa\") " pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.012358 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxvhb\" (UniqueName: \"kubernetes.io/projected/9c3386c0-cab4-47ef-b395-af90773f2796-kube-api-access-rxvhb\") pod \"dnsmasq-dns-7f896c8c65-xl8n7\" (UID: \"9c3386c0-cab4-47ef-b395-af90773f2796\") " pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.012831 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrwsx\" (UniqueName: \"kubernetes.io/projected/cf463d95-25dd-4b99-afb0-dac99157c5fa-kube-api-access-rrwsx\") pod \"ovn-controller-metrics-hx85z\" (UID: \"cf463d95-25dd-4b99-afb0-dac99157c5fa\") " pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.012855 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf463d95-25dd-4b99-afb0-dac99157c5fa-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-hx85z\" (UID: \"cf463d95-25dd-4b99-afb0-dac99157c5fa\") " pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.012883 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf463d95-25dd-4b99-afb0-dac99157c5fa-config\") pod \"ovn-controller-metrics-hx85z\" (UID: \"cf463d95-25dd-4b99-afb0-dac99157c5fa\") " pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.012911 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c3386c0-cab4-47ef-b395-af90773f2796-config\") pod \"dnsmasq-dns-7f896c8c65-xl8n7\" (UID: \"9c3386c0-cab4-47ef-b395-af90773f2796\") " pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.013443 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c3386c0-cab4-47ef-b395-af90773f2796-config\") pod \"dnsmasq-dns-7f896c8c65-xl8n7\" (UID: \"9c3386c0-cab4-47ef-b395-af90773f2796\") " pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.035861 4781 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-rxvhb\" (UniqueName: \"kubernetes.io/projected/9c3386c0-cab4-47ef-b395-af90773f2796-kube-api-access-rxvhb\") pod \"dnsmasq-dns-7f896c8c65-xl8n7\" (UID: \"9c3386c0-cab4-47ef-b395-af90773f2796\") " pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.114801 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf463d95-25dd-4b99-afb0-dac99157c5fa-combined-ca-bundle\") pod \"ovn-controller-metrics-hx85z\" (UID: \"cf463d95-25dd-4b99-afb0-dac99157c5fa\") " pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.114855 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/cf463d95-25dd-4b99-afb0-dac99157c5fa-ovs-rundir\") pod \"ovn-controller-metrics-hx85z\" (UID: \"cf463d95-25dd-4b99-afb0-dac99157c5fa\") " pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.114927 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrwsx\" (UniqueName: \"kubernetes.io/projected/cf463d95-25dd-4b99-afb0-dac99157c5fa-kube-api-access-rrwsx\") pod \"ovn-controller-metrics-hx85z\" (UID: \"cf463d95-25dd-4b99-afb0-dac99157c5fa\") " pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.114945 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf463d95-25dd-4b99-afb0-dac99157c5fa-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-hx85z\" (UID: \"cf463d95-25dd-4b99-afb0-dac99157c5fa\") " pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.114969 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf463d95-25dd-4b99-afb0-dac99157c5fa-config\") pod \"ovn-controller-metrics-hx85z\" (UID: \"cf463d95-25dd-4b99-afb0-dac99157c5fa\") " pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.115012 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/cf463d95-25dd-4b99-afb0-dac99157c5fa-ovn-rundir\") pod \"ovn-controller-metrics-hx85z\" (UID: \"cf463d95-25dd-4b99-afb0-dac99157c5fa\") " pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.115285 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/cf463d95-25dd-4b99-afb0-dac99157c5fa-ovn-rundir\") pod \"ovn-controller-metrics-hx85z\" (UID: \"cf463d95-25dd-4b99-afb0-dac99157c5fa\") " pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.115330 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/cf463d95-25dd-4b99-afb0-dac99157c5fa-ovs-rundir\") pod \"ovn-controller-metrics-hx85z\" (UID: \"cf463d95-25dd-4b99-afb0-dac99157c5fa\") " pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.116421 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf463d95-25dd-4b99-afb0-dac99157c5fa-config\") pod \"ovn-controller-metrics-hx85z\" (UID: \"cf463d95-25dd-4b99-afb0-dac99157c5fa\") " pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.118755 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf463d95-25dd-4b99-afb0-dac99157c5fa-metrics-certs-tls-certs\") pod 
\"ovn-controller-metrics-hx85z\" (UID: \"cf463d95-25dd-4b99-afb0-dac99157c5fa\") " pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.134897 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.136430 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf463d95-25dd-4b99-afb0-dac99157c5fa-combined-ca-bundle\") pod \"ovn-controller-metrics-hx85z\" (UID: \"cf463d95-25dd-4b99-afb0-dac99157c5fa\") " pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.140751 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrwsx\" (UniqueName: \"kubernetes.io/projected/cf463d95-25dd-4b99-afb0-dac99157c5fa-kube-api-access-rrwsx\") pod \"ovn-controller-metrics-hx85z\" (UID: \"cf463d95-25dd-4b99-afb0-dac99157c5fa\") " pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.263506 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-f9h55"] Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.263727 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" podUID="0f2c76ec-cfab-4f18-b624-722021700885" containerName="dnsmasq-dns" containerID="cri-o://b86f56bb626afb20727e0941782a4964e5b9d6b09cdb0c43236b0144e5335336" gracePeriod=10 Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.273140 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.277522 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-hx85z" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.307530 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-4p8q8"] Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.321329 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.325939 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.351444 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-4p8q8"] Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.425725 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-4p8q8\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.425813 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-config\") pod \"dnsmasq-dns-86db49b7ff-4p8q8\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.425834 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4cks\" (UniqueName: \"kubernetes.io/projected/4e0e3c40-86af-4986-bf58-fa79ce187828-kube-api-access-j4cks\") pod \"dnsmasq-dns-86db49b7ff-4p8q8\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 
00:25:44.425859 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-4p8q8\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.425879 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-4p8q8\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.444310 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-86z46" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.456571 4781 generic.go:334] "Generic (PLEG): container finished" podID="0f2c76ec-cfab-4f18-b624-722021700885" containerID="b86f56bb626afb20727e0941782a4964e5b9d6b09cdb0c43236b0144e5335336" exitCode=0 Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.456769 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" event={"ID":"0f2c76ec-cfab-4f18-b624-722021700885","Type":"ContainerDied","Data":"b86f56bb626afb20727e0941782a4964e5b9d6b09cdb0c43236b0144e5335336"} Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.465798 4781 generic.go:334] "Generic (PLEG): container finished" podID="12142f3c-5849-4af1-8c9e-c92304d3c375" containerID="60dc5411409f46143d07ac6ec21c1dbecfd87e552b39e9e423583cb105c9c9c0" exitCode=0 Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.468169 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-86z46" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.468462 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-86z46" event={"ID":"12142f3c-5849-4af1-8c9e-c92304d3c375","Type":"ContainerDied","Data":"60dc5411409f46143d07ac6ec21c1dbecfd87e552b39e9e423583cb105c9c9c0"} Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.468541 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-86z46" event={"ID":"12142f3c-5849-4af1-8c9e-c92304d3c375","Type":"ContainerDied","Data":"719bdff771357569191b4394e9f9c49ad9d8c251f65a73dabcbbf8b3ba8bdaa2"} Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.468567 4781 scope.go:117] "RemoveContainer" containerID="60dc5411409f46143d07ac6ec21c1dbecfd87e552b39e9e423583cb105c9c9c0" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.514824 4781 scope.go:117] "RemoveContainer" containerID="27561c2bc3f7ceff4782a59063d870210455fb0d34834350e478532c8d854d43" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.528973 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12142f3c-5849-4af1-8c9e-c92304d3c375-dns-svc\") pod \"12142f3c-5849-4af1-8c9e-c92304d3c375\" (UID: \"12142f3c-5849-4af1-8c9e-c92304d3c375\") " Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.529181 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gk5b\" (UniqueName: \"kubernetes.io/projected/12142f3c-5849-4af1-8c9e-c92304d3c375-kube-api-access-8gk5b\") pod \"12142f3c-5849-4af1-8c9e-c92304d3c375\" (UID: \"12142f3c-5849-4af1-8c9e-c92304d3c375\") " Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.529241 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/12142f3c-5849-4af1-8c9e-c92304d3c375-config\") pod \"12142f3c-5849-4af1-8c9e-c92304d3c375\" (UID: \"12142f3c-5849-4af1-8c9e-c92304d3c375\") " Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.529604 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-4p8q8\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.529656 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-4p8q8\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.529854 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-4p8q8\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.529913 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4cks\" (UniqueName: \"kubernetes.io/projected/4e0e3c40-86af-4986-bf58-fa79ce187828-kube-api-access-j4cks\") pod \"dnsmasq-dns-86db49b7ff-4p8q8\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.529937 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-config\") pod 
\"dnsmasq-dns-86db49b7ff-4p8q8\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.531185 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-4p8q8\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.531266 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-config\") pod \"dnsmasq-dns-86db49b7ff-4p8q8\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.531283 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-4p8q8\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.531291 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-4p8q8\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.536872 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12142f3c-5849-4af1-8c9e-c92304d3c375-kube-api-access-8gk5b" (OuterVolumeSpecName: "kube-api-access-8gk5b") pod "12142f3c-5849-4af1-8c9e-c92304d3c375" (UID: "12142f3c-5849-4af1-8c9e-c92304d3c375"). 
InnerVolumeSpecName "kube-api-access-8gk5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.542419 4781 scope.go:117] "RemoveContainer" containerID="60dc5411409f46143d07ac6ec21c1dbecfd87e552b39e9e423583cb105c9c9c0" Feb 27 00:25:44 crc kubenswrapper[4781]: E0227 00:25:44.545952 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60dc5411409f46143d07ac6ec21c1dbecfd87e552b39e9e423583cb105c9c9c0\": container with ID starting with 60dc5411409f46143d07ac6ec21c1dbecfd87e552b39e9e423583cb105c9c9c0 not found: ID does not exist" containerID="60dc5411409f46143d07ac6ec21c1dbecfd87e552b39e9e423583cb105c9c9c0" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.546012 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60dc5411409f46143d07ac6ec21c1dbecfd87e552b39e9e423583cb105c9c9c0"} err="failed to get container status \"60dc5411409f46143d07ac6ec21c1dbecfd87e552b39e9e423583cb105c9c9c0\": rpc error: code = NotFound desc = could not find container \"60dc5411409f46143d07ac6ec21c1dbecfd87e552b39e9e423583cb105c9c9c0\": container with ID starting with 60dc5411409f46143d07ac6ec21c1dbecfd87e552b39e9e423583cb105c9c9c0 not found: ID does not exist" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.546043 4781 scope.go:117] "RemoveContainer" containerID="27561c2bc3f7ceff4782a59063d870210455fb0d34834350e478532c8d854d43" Feb 27 00:25:44 crc kubenswrapper[4781]: E0227 00:25:44.546483 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27561c2bc3f7ceff4782a59063d870210455fb0d34834350e478532c8d854d43\": container with ID starting with 27561c2bc3f7ceff4782a59063d870210455fb0d34834350e478532c8d854d43 not found: ID does not exist" containerID="27561c2bc3f7ceff4782a59063d870210455fb0d34834350e478532c8d854d43" Feb 27 00:25:44 
crc kubenswrapper[4781]: I0227 00:25:44.546575 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27561c2bc3f7ceff4782a59063d870210455fb0d34834350e478532c8d854d43"} err="failed to get container status \"27561c2bc3f7ceff4782a59063d870210455fb0d34834350e478532c8d854d43\": rpc error: code = NotFound desc = could not find container \"27561c2bc3f7ceff4782a59063d870210455fb0d34834350e478532c8d854d43\": container with ID starting with 27561c2bc3f7ceff4782a59063d870210455fb0d34834350e478532c8d854d43 not found: ID does not exist" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.551453 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4cks\" (UniqueName: \"kubernetes.io/projected/4e0e3c40-86af-4986-bf58-fa79ce187828-kube-api-access-j4cks\") pod \"dnsmasq-dns-86db49b7ff-4p8q8\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.593988 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12142f3c-5849-4af1-8c9e-c92304d3c375-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "12142f3c-5849-4af1-8c9e-c92304d3c375" (UID: "12142f3c-5849-4af1-8c9e-c92304d3c375"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.597043 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12142f3c-5849-4af1-8c9e-c92304d3c375-config" (OuterVolumeSpecName: "config") pod "12142f3c-5849-4af1-8c9e-c92304d3c375" (UID: "12142f3c-5849-4af1-8c9e-c92304d3c375"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.631446 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gk5b\" (UniqueName: \"kubernetes.io/projected/12142f3c-5849-4af1-8c9e-c92304d3c375-kube-api-access-8gk5b\") on node \"crc\" DevicePath \"\"" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.631473 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12142f3c-5849-4af1-8c9e-c92304d3c375-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.631483 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12142f3c-5849-4af1-8c9e-c92304d3c375-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.730016 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.850696 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-xl8n7"] Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.872707 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-86z46"] Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.879715 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-86z46"] Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.905696 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-xl8n7"] Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.921229 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-gc44h"] Feb 27 00:25:44 crc kubenswrapper[4781]: E0227 00:25:44.921580 4781 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="12142f3c-5849-4af1-8c9e-c92304d3c375" containerName="init" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.921598 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="12142f3c-5849-4af1-8c9e-c92304d3c375" containerName="init" Feb 27 00:25:44 crc kubenswrapper[4781]: E0227 00:25:44.921616 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12142f3c-5849-4af1-8c9e-c92304d3c375" containerName="dnsmasq-dns" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.921634 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="12142f3c-5849-4af1-8c9e-c92304d3c375" containerName="dnsmasq-dns" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.921797 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="12142f3c-5849-4af1-8c9e-c92304d3c375" containerName="dnsmasq-dns" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.922664 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:25:44 crc kubenswrapper[4781]: I0227 00:25:44.966052 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gc44h"] Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.049529 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-config\") pod \"dnsmasq-dns-698758b865-gc44h\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") " pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.049966 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jz5b\" (UniqueName: \"kubernetes.io/projected/8e37b0a7-69ac-439e-9c5a-207210fe40c8-kube-api-access-8jz5b\") pod \"dnsmasq-dns-698758b865-gc44h\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") " pod="openstack/dnsmasq-dns-698758b865-gc44h" 
Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.050007 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-gc44h\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") " pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.050025 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-gc44h\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") " pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.050077 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-dns-svc\") pod \"dnsmasq-dns-698758b865-gc44h\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") " pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.153608 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-dns-svc\") pod \"dnsmasq-dns-698758b865-gc44h\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") " pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.153678 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-config\") pod \"dnsmasq-dns-698758b865-gc44h\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") " pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:25:45 crc kubenswrapper[4781]: 
I0227 00:25:45.153759 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jz5b\" (UniqueName: \"kubernetes.io/projected/8e37b0a7-69ac-439e-9c5a-207210fe40c8-kube-api-access-8jz5b\") pod \"dnsmasq-dns-698758b865-gc44h\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") " pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.153801 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-gc44h\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") " pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.153815 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-gc44h\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") " pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.154681 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-config\") pod \"dnsmasq-dns-698758b865-gc44h\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") " pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.154687 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-gc44h\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") " pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.155221 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-gc44h\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") " pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.158680 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-dns-svc\") pod \"dnsmasq-dns-698758b865-gc44h\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") " pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.168897 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-hx85z"] Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.177402 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jz5b\" (UniqueName: \"kubernetes.io/projected/8e37b0a7-69ac-439e-9c5a-207210fe40c8-kube-api-access-8jz5b\") pod \"dnsmasq-dns-698758b865-gc44h\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") " pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.186858 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" Feb 27 00:25:45 crc kubenswrapper[4781]: W0227 00:25:45.190120 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf463d95_25dd_4b99_afb0_dac99157c5fa.slice/crio-691968953b9bd97fdf1a1b394531474f9e18fddde0d7c4e11d17dbcfe9a75c62 WatchSource:0}: Error finding container 691968953b9bd97fdf1a1b394531474f9e18fddde0d7c4e11d17dbcfe9a75c62: Status 404 returned error can't find the container with id 691968953b9bd97fdf1a1b394531474f9e18fddde0d7c4e11d17dbcfe9a75c62 Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.327264 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12142f3c-5849-4af1-8c9e-c92304d3c375" path="/var/lib/kubelet/pods/12142f3c-5849-4af1-8c9e-c92304d3c375/volumes" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.356822 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f2c76ec-cfab-4f18-b624-722021700885-dns-svc\") pod \"0f2c76ec-cfab-4f18-b624-722021700885\" (UID: \"0f2c76ec-cfab-4f18-b624-722021700885\") " Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.357139 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fw2z9\" (UniqueName: \"kubernetes.io/projected/0f2c76ec-cfab-4f18-b624-722021700885-kube-api-access-fw2z9\") pod \"0f2c76ec-cfab-4f18-b624-722021700885\" (UID: \"0f2c76ec-cfab-4f18-b624-722021700885\") " Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.357167 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f2c76ec-cfab-4f18-b624-722021700885-config\") pod \"0f2c76ec-cfab-4f18-b624-722021700885\" (UID: \"0f2c76ec-cfab-4f18-b624-722021700885\") " Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.363386 4781 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f2c76ec-cfab-4f18-b624-722021700885-kube-api-access-fw2z9" (OuterVolumeSpecName: "kube-api-access-fw2z9") pod "0f2c76ec-cfab-4f18-b624-722021700885" (UID: "0f2c76ec-cfab-4f18-b624-722021700885"). InnerVolumeSpecName "kube-api-access-fw2z9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.415191 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f2c76ec-cfab-4f18-b624-722021700885-config" (OuterVolumeSpecName: "config") pod "0f2c76ec-cfab-4f18-b624-722021700885" (UID: "0f2c76ec-cfab-4f18-b624-722021700885"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.417177 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f2c76ec-cfab-4f18-b624-722021700885-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0f2c76ec-cfab-4f18-b624-722021700885" (UID: "0f2c76ec-cfab-4f18-b624-722021700885"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.459526 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f2c76ec-cfab-4f18-b624-722021700885-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.459567 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fw2z9\" (UniqueName: \"kubernetes.io/projected/0f2c76ec-cfab-4f18-b624-722021700885-kube-api-access-fw2z9\") on node \"crc\" DevicePath \"\"" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.459579 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f2c76ec-cfab-4f18-b624-722021700885-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.476148 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" event={"ID":"0f2c76ec-cfab-4f18-b624-722021700885","Type":"ContainerDied","Data":"893d8f643c3cbe02cd19b59bf3115d432c587df0f05ea410ba0d0253101d7031"} Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.476210 4781 scope.go:117] "RemoveContainer" containerID="b86f56bb626afb20727e0941782a4964e5b9d6b09cdb0c43236b0144e5335336" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.476320 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-f9h55" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.482161 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.497993 4781 generic.go:334] "Generic (PLEG): container finished" podID="9c3386c0-cab4-47ef-b395-af90773f2796" containerID="8a7c06cfb85e979a54da7be5977198a52079b1205854f4d374a36e5f20b42991" exitCode=0 Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.498083 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" event={"ID":"9c3386c0-cab4-47ef-b395-af90773f2796","Type":"ContainerDied","Data":"8a7c06cfb85e979a54da7be5977198a52079b1205854f4d374a36e5f20b42991"} Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.498110 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" event={"ID":"9c3386c0-cab4-47ef-b395-af90773f2796","Type":"ContainerStarted","Data":"a67b45e33a1d1e91736f217aef1a74906650aa1373c29592e7ce67091189b528"} Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.507050 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-hx85z" event={"ID":"cf463d95-25dd-4b99-afb0-dac99157c5fa","Type":"ContainerStarted","Data":"691968953b9bd97fdf1a1b394531474f9e18fddde0d7c4e11d17dbcfe9a75c62"} Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.530765 4781 scope.go:117] "RemoveContainer" containerID="2b13d210b4d0aa474faa06d53125b956502bdd360564f39dba668453c12f7cac" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.555531 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-f9h55"] Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.569380 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-f9h55"] Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.577552 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-4p8q8"] Feb 27 00:25:45 crc kubenswrapper[4781]: W0227 
00:25:45.622032 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e0e3c40_86af_4986_bf58_fa79ce187828.slice/crio-5f275f16796eb838b7768e9505dd01e92ce2776a3d08fa355285fb9c70fdc93f WatchSource:0}: Error finding container 5f275f16796eb838b7768e9505dd01e92ce2776a3d08fa355285fb9c70fdc93f: Status 404 returned error can't find the container with id 5f275f16796eb838b7768e9505dd01e92ce2776a3d08fa355285fb9c70fdc93f Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.886722 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.975170 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c3386c0-cab4-47ef-b395-af90773f2796-ovsdbserver-sb\") pod \"9c3386c0-cab4-47ef-b395-af90773f2796\" (UID: \"9c3386c0-cab4-47ef-b395-af90773f2796\") " Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.975586 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c3386c0-cab4-47ef-b395-af90773f2796-dns-svc\") pod \"9c3386c0-cab4-47ef-b395-af90773f2796\" (UID: \"9c3386c0-cab4-47ef-b395-af90773f2796\") " Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.976288 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxvhb\" (UniqueName: \"kubernetes.io/projected/9c3386c0-cab4-47ef-b395-af90773f2796-kube-api-access-rxvhb\") pod \"9c3386c0-cab4-47ef-b395-af90773f2796\" (UID: \"9c3386c0-cab4-47ef-b395-af90773f2796\") " Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.976341 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c3386c0-cab4-47ef-b395-af90773f2796-config\") pod 
\"9c3386c0-cab4-47ef-b395-af90773f2796\" (UID: \"9c3386c0-cab4-47ef-b395-af90773f2796\") " Feb 27 00:25:45 crc kubenswrapper[4781]: I0227 00:25:45.985846 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c3386c0-cab4-47ef-b395-af90773f2796-kube-api-access-rxvhb" (OuterVolumeSpecName: "kube-api-access-rxvhb") pod "9c3386c0-cab4-47ef-b395-af90773f2796" (UID: "9c3386c0-cab4-47ef-b395-af90773f2796"). InnerVolumeSpecName "kube-api-access-rxvhb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.000362 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c3386c0-cab4-47ef-b395-af90773f2796-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9c3386c0-cab4-47ef-b395-af90773f2796" (UID: "9c3386c0-cab4-47ef-b395-af90773f2796"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.006232 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c3386c0-cab4-47ef-b395-af90773f2796-config" (OuterVolumeSpecName: "config") pod "9c3386c0-cab4-47ef-b395-af90773f2796" (UID: "9c3386c0-cab4-47ef-b395-af90773f2796"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.014551 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c3386c0-cab4-47ef-b395-af90773f2796-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9c3386c0-cab4-47ef-b395-af90773f2796" (UID: "9c3386c0-cab4-47ef-b395-af90773f2796"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.078977 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxvhb\" (UniqueName: \"kubernetes.io/projected/9c3386c0-cab4-47ef-b395-af90773f2796-kube-api-access-rxvhb\") on node \"crc\" DevicePath \"\"" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.079015 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c3386c0-cab4-47ef-b395-af90773f2796-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.079028 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c3386c0-cab4-47ef-b395-af90773f2796-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.079060 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c3386c0-cab4-47ef-b395-af90773f2796-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.107196 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gc44h"] Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.130575 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 27 00:25:46 crc kubenswrapper[4781]: E0227 00:25:46.131593 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f2c76ec-cfab-4f18-b624-722021700885" containerName="init" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.131655 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f2c76ec-cfab-4f18-b624-722021700885" containerName="init" Feb 27 00:25:46 crc kubenswrapper[4781]: E0227 00:25:46.131719 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f2c76ec-cfab-4f18-b624-722021700885" containerName="dnsmasq-dns" 
Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.131728 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f2c76ec-cfab-4f18-b624-722021700885" containerName="dnsmasq-dns" Feb 27 00:25:46 crc kubenswrapper[4781]: E0227 00:25:46.131755 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c3386c0-cab4-47ef-b395-af90773f2796" containerName="init" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.131763 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c3386c0-cab4-47ef-b395-af90773f2796" containerName="init" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.132076 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c3386c0-cab4-47ef-b395-af90773f2796" containerName="init" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.132127 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f2c76ec-cfab-4f18-b624-722021700885" containerName="dnsmasq-dns" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.141295 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.146158 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.146422 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-lg72x" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.146574 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.147532 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.149681 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.285980 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.286049 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.286071 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-lock\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 
00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.286102 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-60f728cf-70d2-4c10-ab5d-703f88ca79e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-60f728cf-70d2-4c10-ab5d-703f88ca79e1\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.286140 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc7d6\" (UniqueName: \"kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-kube-api-access-vc7d6\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.286163 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-cache\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.387846 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc7d6\" (UniqueName: \"kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-kube-api-access-vc7d6\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.387905 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-cache\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.388085 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.388148 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.388177 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-lock\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.388218 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-60f728cf-70d2-4c10-ab5d-703f88ca79e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-60f728cf-70d2-4c10-ab5d-703f88ca79e1\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: E0227 00:25:46.388893 4781 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 27 00:25:46 crc kubenswrapper[4781]: E0227 00:25:46.388914 4781 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 27 00:25:46 crc kubenswrapper[4781]: E0227 00:25:46.388962 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift 
podName:fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11 nodeName:}" failed. No retries permitted until 2026-02-27 00:25:46.888942773 +0000 UTC m=+1216.146482327 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift") pod "swift-storage-0" (UID: "fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11") : configmap "swift-ring-files" not found Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.389280 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-cache\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.389837 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-lock\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.393040 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.398584 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.398635 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-60f728cf-70d2-4c10-ab5d-703f88ca79e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-60f728cf-70d2-4c10-ab5d-703f88ca79e1\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cd48650d90eb686d90983f7c34ee50ce064964e161dc3cb092803e098630b47a/globalmount\"" pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.403223 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc7d6\" (UniqueName: \"kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-kube-api-access-vc7d6\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.428621 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-60f728cf-70d2-4c10-ab5d-703f88ca79e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-60f728cf-70d2-4c10-ab5d-703f88ca79e1\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.517699 4781 generic.go:334] "Generic (PLEG): container finished" podID="58009056-4183-4017-bfa1-c14ce28b92ea" containerID="f357b9c4effa13cc010fc9a965fa4ab73f4546260126cc8790d5d9d2d0f6d40a" exitCode=0 Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.517792 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"58009056-4183-4017-bfa1-c14ce28b92ea","Type":"ContainerDied","Data":"f357b9c4effa13cc010fc9a965fa4ab73f4546260126cc8790d5d9d2d0f6d40a"} Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.520541 4781 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" event={"ID":"9c3386c0-cab4-47ef-b395-af90773f2796","Type":"ContainerDied","Data":"a67b45e33a1d1e91736f217aef1a74906650aa1373c29592e7ce67091189b528"} Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.520619 4781 scope.go:117] "RemoveContainer" containerID="8a7c06cfb85e979a54da7be5977198a52079b1205854f4d374a36e5f20b42991" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.520564 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-xl8n7" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.526404 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gc44h" event={"ID":"8e37b0a7-69ac-439e-9c5a-207210fe40c8","Type":"ContainerStarted","Data":"e1839c0058f09c92d633d8b44bcde9496faf128970e2a8993b81a296f21aac5b"} Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.533025 4781 generic.go:334] "Generic (PLEG): container finished" podID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerID="a458867b742ce8b5b3fdd2c97ebf1845a6845fd00e046dd893821ec44de7237b" exitCode=0 Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.533312 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1f85c54b-b800-429a-ba2d-fe22056ac907","Type":"ContainerDied","Data":"a458867b742ce8b5b3fdd2c97ebf1845a6845fd00e046dd893821ec44de7237b"} Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.536167 4781 generic.go:334] "Generic (PLEG): container finished" podID="4e0e3c40-86af-4986-bf58-fa79ce187828" containerID="aabc0546990906abca2b35d28ee124be17d9e0ff7483abdc7c60ccee1cba5f86" exitCode=0 Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.536296 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" 
event={"ID":"4e0e3c40-86af-4986-bf58-fa79ce187828","Type":"ContainerDied","Data":"aabc0546990906abca2b35d28ee124be17d9e0ff7483abdc7c60ccee1cba5f86"} Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.536332 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" event={"ID":"4e0e3c40-86af-4986-bf58-fa79ce187828","Type":"ContainerStarted","Data":"5f275f16796eb838b7768e9505dd01e92ce2776a3d08fa355285fb9c70fdc93f"} Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.546110 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-hx85z" event={"ID":"cf463d95-25dd-4b99-afb0-dac99157c5fa","Type":"ContainerStarted","Data":"81e77b5bfd1fd87b6a34ee8c16757e47b1ff59fe040db1ca38dbbecf4c29c9ba"} Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.548232 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-6n9rn"] Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.551508 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.553499 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.555471 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.555484 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.556220 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-6n9rn"] Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.580696 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-hx85z" podStartSLOduration=3.5806029329999998 podStartE2EDuration="3.580602933s" podCreationTimestamp="2026-02-27 00:25:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:25:46.569590713 +0000 UTC m=+1215.827130277" watchObservedRunningTime="2026-02-27 00:25:46.580602933 +0000 UTC m=+1215.838142497" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.686059 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-xl8n7"] Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.693378 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-swiftconf\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.693475 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-ring-data-devices\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.693573 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-scripts\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.693613 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-combined-ca-bundle\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.693687 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6d9g\" (UniqueName: \"kubernetes.io/projected/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-kube-api-access-k6d9g\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.693815 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-dispersionconf\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.693837 
4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-etc-swift\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.706165 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-xl8n7"] Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.795653 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-swiftconf\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.795714 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-ring-data-devices\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.795783 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-scripts\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.795808 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-combined-ca-bundle\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " 
pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.795849 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6d9g\" (UniqueName: \"kubernetes.io/projected/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-kube-api-access-k6d9g\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.795914 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-dispersionconf\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.796680 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-etc-swift\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.796738 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-ring-data-devices\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.796862 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-scripts\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 
00:25:46.797057 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-etc-swift\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.799684 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-combined-ca-bundle\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.800938 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-dispersionconf\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.801173 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-swiftconf\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.813865 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6d9g\" (UniqueName: \"kubernetes.io/projected/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-kube-api-access-k6d9g\") pod \"swift-ring-rebalance-6n9rn\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: E0227 00:25:46.854614 4781 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Feb 27 00:25:46 crc 
kubenswrapper[4781]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/4e0e3c40-86af-4986-bf58-fa79ce187828/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 27 00:25:46 crc kubenswrapper[4781]: > podSandboxID="5f275f16796eb838b7768e9505dd01e92ce2776a3d08fa355285fb9c70fdc93f" Feb 27 00:25:46 crc kubenswrapper[4781]: E0227 00:25:46.854841 4781 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 00:25:46 crc kubenswrapper[4781]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n599h5cbh7ch5d4h66fh676hdbh546h95h88h5ffh55ch7fhch57ch687hddhc7h5fdh57dh674h56fh64ch98h9bh557h55dh646h54ch54fh5c4h597q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/
hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j4cks,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-86db49b7ff-4p8q8_openstack(4e0e3c40-86af-4986-bf58-fa79ce187828): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/4e0e3c40-86af-4986-bf58-fa79ce187828/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 27 00:25:46 crc kubenswrapper[4781]: > logger="UnhandledError" Feb 27 00:25:46 crc kubenswrapper[4781]: E0227 00:25:46.856106 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with 
CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/4e0e3c40-86af-4986-bf58-fa79ce187828/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" podUID="4e0e3c40-86af-4986-bf58-fa79ce187828" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.868501 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:25:46 crc kubenswrapper[4781]: I0227 00:25:46.902143 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:46 crc kubenswrapper[4781]: E0227 00:25:46.902371 4781 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 27 00:25:46 crc kubenswrapper[4781]: E0227 00:25:46.902406 4781 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 27 00:25:46 crc kubenswrapper[4781]: E0227 00:25:46.902481 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift podName:fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11 nodeName:}" failed. No retries permitted until 2026-02-27 00:25:47.902459898 +0000 UTC m=+1217.159999452 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift") pod "swift-storage-0" (UID: "fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11") : configmap "swift-ring-files" not found Feb 27 00:25:47 crc kubenswrapper[4781]: I0227 00:25:47.352869 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f2c76ec-cfab-4f18-b624-722021700885" path="/var/lib/kubelet/pods/0f2c76ec-cfab-4f18-b624-722021700885/volumes" Feb 27 00:25:47 crc kubenswrapper[4781]: I0227 00:25:47.355329 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c3386c0-cab4-47ef-b395-af90773f2796" path="/var/lib/kubelet/pods/9c3386c0-cab4-47ef-b395-af90773f2796/volumes" Feb 27 00:25:47 crc kubenswrapper[4781]: I0227 00:25:47.388282 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-6n9rn"] Feb 27 00:25:47 crc kubenswrapper[4781]: E0227 00:25:47.508622 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e37b0a7_69ac_439e_9c5a_207210fe40c8.slice/crio-conmon-31cd21a634eff04c79df7b5ee8d37fc4cdb1a4b5a72c57fc0d9aca1961c28780.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e37b0a7_69ac_439e_9c5a_207210fe40c8.slice/crio-31cd21a634eff04c79df7b5ee8d37fc4cdb1a4b5a72c57fc0d9aca1961c28780.scope\": RecentStats: unable to find data in memory cache]" Feb 27 00:25:47 crc kubenswrapper[4781]: I0227 00:25:47.566560 4781 generic.go:334] "Generic (PLEG): container finished" podID="8e37b0a7-69ac-439e-9c5a-207210fe40c8" containerID="31cd21a634eff04c79df7b5ee8d37fc4cdb1a4b5a72c57fc0d9aca1961c28780" exitCode=0 Feb 27 00:25:47 crc kubenswrapper[4781]: I0227 00:25:47.566659 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gc44h" 
event={"ID":"8e37b0a7-69ac-439e-9c5a-207210fe40c8","Type":"ContainerDied","Data":"31cd21a634eff04c79df7b5ee8d37fc4cdb1a4b5a72c57fc0d9aca1961c28780"} Feb 27 00:25:47 crc kubenswrapper[4781]: I0227 00:25:47.568698 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-6n9rn" event={"ID":"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b","Type":"ContainerStarted","Data":"ca26accad7ac480d16da11e818bc3769c592f1c77082ff70a7fdd81af22f0086"} Feb 27 00:25:47 crc kubenswrapper[4781]: I0227 00:25:47.935035 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:47 crc kubenswrapper[4781]: E0227 00:25:47.935252 4781 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 27 00:25:47 crc kubenswrapper[4781]: E0227 00:25:47.935459 4781 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 27 00:25:47 crc kubenswrapper[4781]: E0227 00:25:47.935518 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift podName:fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11 nodeName:}" failed. No retries permitted until 2026-02-27 00:25:49.935501207 +0000 UTC m=+1219.193040761 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift") pod "swift-storage-0" (UID: "fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11") : configmap "swift-ring-files" not found Feb 27 00:25:48 crc kubenswrapper[4781]: I0227 00:25:48.580152 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" event={"ID":"877c39ec-0202-4987-b6e7-4fb90c4dc9b5","Type":"ContainerStarted","Data":"d0f2b15fcdbb4ac98c885abfd6fcaf06d850f08cfcf87c7b00d69c42cb65e362"} Feb 27 00:25:48 crc kubenswrapper[4781]: I0227 00:25:48.581023 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:48 crc kubenswrapper[4781]: I0227 00:25:48.585932 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" event={"ID":"4e0e3c40-86af-4986-bf58-fa79ce187828","Type":"ContainerStarted","Data":"5e167781defb17ca0ec0a66f7db9a5b9464f7410f615b9d338fc5bcbdaa6963b"} Feb 27 00:25:48 crc kubenswrapper[4781]: I0227 00:25:48.586245 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:48 crc kubenswrapper[4781]: I0227 00:25:48.590636 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"42503ae1-b143-45c3-8789-e2d1f72cc335","Type":"ContainerStarted","Data":"26ec8ae768d4844b138da07d723a9517917ce9493f3ade7645757aaba568e403"} Feb 27 00:25:48 crc kubenswrapper[4781]: I0227 00:25:48.590929 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:25:48 crc kubenswrapper[4781]: I0227 00:25:48.596566 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gc44h" 
event={"ID":"8e37b0a7-69ac-439e-9c5a-207210fe40c8","Type":"ContainerStarted","Data":"e2bf980506549d387ee967a300bd50ff50a9e4489a44bcc2c952a5e2c00137a5"} Feb 27 00:25:48 crc kubenswrapper[4781]: I0227 00:25:48.597135 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:25:48 crc kubenswrapper[4781]: I0227 00:25:48.607340 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" podStartSLOduration=5.629067275 podStartE2EDuration="33.607329735s" podCreationTimestamp="2026-02-27 00:25:15 +0000 UTC" firstStartedPulling="2026-02-27 00:25:19.466931981 +0000 UTC m=+1188.724471545" lastFinishedPulling="2026-02-27 00:25:47.445194451 +0000 UTC m=+1216.702734005" observedRunningTime="2026-02-27 00:25:48.607055088 +0000 UTC m=+1217.864594682" watchObservedRunningTime="2026-02-27 00:25:48.607329735 +0000 UTC m=+1217.864869289" Feb 27 00:25:48 crc kubenswrapper[4781]: I0227 00:25:48.623005 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-bxttl" Feb 27 00:25:48 crc kubenswrapper[4781]: I0227 00:25:48.635486 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-compactor-0" podStartSLOduration=-9223372003.219313 podStartE2EDuration="33.635463904s" podCreationTimestamp="2026-02-27 00:25:15 +0000 UTC" firstStartedPulling="2026-02-27 00:25:19.492249535 +0000 UTC m=+1188.749789089" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:25:48.625699947 +0000 UTC m=+1217.883239501" watchObservedRunningTime="2026-02-27 00:25:48.635463904 +0000 UTC m=+1217.893003458" Feb 27 00:25:48 crc kubenswrapper[4781]: I0227 00:25:48.642670 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" podStartSLOduration=4.642642232 
podStartE2EDuration="4.642642232s" podCreationTimestamp="2026-02-27 00:25:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:25:48.642076407 +0000 UTC m=+1217.899615961" watchObservedRunningTime="2026-02-27 00:25:48.642642232 +0000 UTC m=+1217.900181786" Feb 27 00:25:48 crc kubenswrapper[4781]: I0227 00:25:48.667194 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-gc44h" podStartSLOduration=4.667176936 podStartE2EDuration="4.667176936s" podCreationTimestamp="2026-02-27 00:25:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:25:48.663829328 +0000 UTC m=+1217.921368882" watchObservedRunningTime="2026-02-27 00:25:48.667176936 +0000 UTC m=+1217.924716490" Feb 27 00:25:50 crc kubenswrapper[4781]: I0227 00:25:50.026222 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:50 crc kubenswrapper[4781]: E0227 00:25:50.026562 4781 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 27 00:25:50 crc kubenswrapper[4781]: E0227 00:25:50.027197 4781 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 27 00:25:50 crc kubenswrapper[4781]: E0227 00:25:50.027310 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift podName:fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11 nodeName:}" failed. 
No retries permitted until 2026-02-27 00:25:54.027279956 +0000 UTC m=+1223.284819510 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift") pod "swift-storage-0" (UID: "fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11") : configmap "swift-ring-files" not found Feb 27 00:25:50 crc kubenswrapper[4781]: I0227 00:25:50.634539 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" event={"ID":"233250c8-3871-43ec-8c1d-47bd1d3133e1","Type":"ContainerStarted","Data":"ab5e0848a0d2d1d0af3adfd42b67a8503712e61416ac52d021b088ec1c10cde8"} Feb 27 00:25:50 crc kubenswrapper[4781]: I0227 00:25:50.635252 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:50 crc kubenswrapper[4781]: I0227 00:25:50.640166 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"91997a3e-9e65-4eab-a0b9-8f9c639a8d05","Type":"ContainerStarted","Data":"59ed5bb57f5c002905a336da46ce8019d8424d181ddbd01fd683c6c25bea9d90"} Feb 27 00:25:50 crc kubenswrapper[4781]: I0227 00:25:50.640405 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 27 00:25:50 crc kubenswrapper[4781]: I0227 00:25:50.646089 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" event={"ID":"d9e3acc2-cee4-4bfe-af04-3a64041fc327","Type":"ContainerStarted","Data":"2a2354aca49bd460af5e535cec347a181edf8596ec560070d98bfc1ab2bff2e0"} Feb 27 00:25:50 crc kubenswrapper[4781]: I0227 00:25:50.646507 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:25:50 crc kubenswrapper[4781]: I0227 00:25:50.653176 4781 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" podStartSLOduration=-9223372001.20162 podStartE2EDuration="35.65315532s" podCreationTimestamp="2026-02-27 00:25:15 +0000 UTC" firstStartedPulling="2026-02-27 00:25:19.459858315 +0000 UTC m=+1188.717397869" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:25:50.650423908 +0000 UTC m=+1219.907963462" watchObservedRunningTime="2026-02-27 00:25:50.65315532 +0000 UTC m=+1219.910694874" Feb 27 00:25:50 crc kubenswrapper[4781]: I0227 00:25:50.663817 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-mj87x" Feb 27 00:25:50 crc kubenswrapper[4781]: I0227 00:25:50.675691 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" podStartSLOduration=-9223372001.179104 podStartE2EDuration="35.675672791s" podCreationTimestamp="2026-02-27 00:25:15 +0000 UTC" firstStartedPulling="2026-02-27 00:25:19.447545972 +0000 UTC m=+1188.705085526" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:25:50.672752624 +0000 UTC m=+1219.930292178" watchObservedRunningTime="2026-02-27 00:25:50.675672791 +0000 UTC m=+1219.933212365" Feb 27 00:25:50 crc kubenswrapper[4781]: I0227 00:25:50.705029 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=15.707234587 podStartE2EDuration="46.705011471s" podCreationTimestamp="2026-02-27 00:25:04 +0000 UTC" firstStartedPulling="2026-02-27 00:25:18.862539911 +0000 UTC m=+1188.120079465" lastFinishedPulling="2026-02-27 00:25:49.860316795 +0000 UTC m=+1219.117856349" observedRunningTime="2026-02-27 00:25:50.691788264 +0000 UTC m=+1219.949327818" watchObservedRunningTime="2026-02-27 00:25:50.705011471 +0000 UTC 
m=+1219.962551025" Feb 27 00:25:51 crc kubenswrapper[4781]: I0227 00:25:51.068393 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 27 00:25:51 crc kubenswrapper[4781]: I0227 00:25:51.068448 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 27 00:25:51 crc kubenswrapper[4781]: I0227 00:25:51.186340 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 27 00:25:51 crc kubenswrapper[4781]: I0227 00:25:51.654793 4781 generic.go:334] "Generic (PLEG): container finished" podID="919ba171-1971-416c-99c1-5dfcacc10a28" containerID="96bdfdf43af7fb258b62a9d9f821ba1ad93ea827a8cff8b6d290f5066fe5b0b7" exitCode=0 Feb 27 00:25:51 crc kubenswrapper[4781]: I0227 00:25:51.654854 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"919ba171-1971-416c-99c1-5dfcacc10a28","Type":"ContainerDied","Data":"96bdfdf43af7fb258b62a9d9f821ba1ad93ea827a8cff8b6d290f5066fe5b0b7"} Feb 27 00:25:51 crc kubenswrapper[4781]: I0227 00:25:51.736343 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 27 00:25:52 crc kubenswrapper[4781]: I0227 00:25:52.245177 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 27 00:25:52 crc kubenswrapper[4781]: I0227 00:25:52.245591 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 27 00:25:52 crc kubenswrapper[4781]: I0227 00:25:52.325053 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 27 00:25:52 crc kubenswrapper[4781]: I0227 00:25:52.732872 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 27 00:25:52 crc 
kubenswrapper[4781]: I0227 00:25:52.860056 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-8c9b-account-create-update-d29bm"] Feb 27 00:25:52 crc kubenswrapper[4781]: I0227 00:25:52.861412 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8c9b-account-create-update-d29bm" Feb 27 00:25:52 crc kubenswrapper[4781]: I0227 00:25:52.872155 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-jrxqx"] Feb 27 00:25:52 crc kubenswrapper[4781]: I0227 00:25:52.873346 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jrxqx" Feb 27 00:25:52 crc kubenswrapper[4781]: I0227 00:25:52.877194 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 27 00:25:52 crc kubenswrapper[4781]: I0227 00:25:52.891601 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-jrxqx"] Feb 27 00:25:52 crc kubenswrapper[4781]: I0227 00:25:52.911170 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8c9b-account-create-update-d29bm"] Feb 27 00:25:52 crc kubenswrapper[4781]: I0227 00:25:52.993996 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chshx\" (UniqueName: \"kubernetes.io/projected/f1713962-9458-45b2-9f28-61409b7ff581-kube-api-access-chshx\") pod \"glance-db-create-jrxqx\" (UID: \"f1713962-9458-45b2-9f28-61409b7ff581\") " pod="openstack/glance-db-create-jrxqx" Feb 27 00:25:52 crc kubenswrapper[4781]: I0227 00:25:52.994033 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb4687ec-812e-48bb-8d53-ed628f3cd013-operator-scripts\") pod \"glance-8c9b-account-create-update-d29bm\" (UID: \"bb4687ec-812e-48bb-8d53-ed628f3cd013\") " 
pod="openstack/glance-8c9b-account-create-update-d29bm" Feb 27 00:25:52 crc kubenswrapper[4781]: I0227 00:25:52.994073 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1713962-9458-45b2-9f28-61409b7ff581-operator-scripts\") pod \"glance-db-create-jrxqx\" (UID: \"f1713962-9458-45b2-9f28-61409b7ff581\") " pod="openstack/glance-db-create-jrxqx" Feb 27 00:25:52 crc kubenswrapper[4781]: I0227 00:25:52.994370 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2c6b\" (UniqueName: \"kubernetes.io/projected/bb4687ec-812e-48bb-8d53-ed628f3cd013-kube-api-access-l2c6b\") pod \"glance-8c9b-account-create-update-d29bm\" (UID: \"bb4687ec-812e-48bb-8d53-ed628f3cd013\") " pod="openstack/glance-8c9b-account-create-update-d29bm" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.096437 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2c6b\" (UniqueName: \"kubernetes.io/projected/bb4687ec-812e-48bb-8d53-ed628f3cd013-kube-api-access-l2c6b\") pod \"glance-8c9b-account-create-update-d29bm\" (UID: \"bb4687ec-812e-48bb-8d53-ed628f3cd013\") " pod="openstack/glance-8c9b-account-create-update-d29bm" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.096961 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chshx\" (UniqueName: \"kubernetes.io/projected/f1713962-9458-45b2-9f28-61409b7ff581-kube-api-access-chshx\") pod \"glance-db-create-jrxqx\" (UID: \"f1713962-9458-45b2-9f28-61409b7ff581\") " pod="openstack/glance-db-create-jrxqx" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.096986 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb4687ec-812e-48bb-8d53-ed628f3cd013-operator-scripts\") pod 
\"glance-8c9b-account-create-update-d29bm\" (UID: \"bb4687ec-812e-48bb-8d53-ed628f3cd013\") " pod="openstack/glance-8c9b-account-create-update-d29bm" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.097021 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1713962-9458-45b2-9f28-61409b7ff581-operator-scripts\") pod \"glance-db-create-jrxqx\" (UID: \"f1713962-9458-45b2-9f28-61409b7ff581\") " pod="openstack/glance-db-create-jrxqx" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.097778 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1713962-9458-45b2-9f28-61409b7ff581-operator-scripts\") pod \"glance-db-create-jrxqx\" (UID: \"f1713962-9458-45b2-9f28-61409b7ff581\") " pod="openstack/glance-db-create-jrxqx" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.097837 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb4687ec-812e-48bb-8d53-ed628f3cd013-operator-scripts\") pod \"glance-8c9b-account-create-update-d29bm\" (UID: \"bb4687ec-812e-48bb-8d53-ed628f3cd013\") " pod="openstack/glance-8c9b-account-create-update-d29bm" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.129740 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chshx\" (UniqueName: \"kubernetes.io/projected/f1713962-9458-45b2-9f28-61409b7ff581-kube-api-access-chshx\") pod \"glance-db-create-jrxqx\" (UID: \"f1713962-9458-45b2-9f28-61409b7ff581\") " pod="openstack/glance-db-create-jrxqx" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.129930 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2c6b\" (UniqueName: \"kubernetes.io/projected/bb4687ec-812e-48bb-8d53-ed628f3cd013-kube-api-access-l2c6b\") pod 
\"glance-8c9b-account-create-update-d29bm\" (UID: \"bb4687ec-812e-48bb-8d53-ed628f3cd013\") " pod="openstack/glance-8c9b-account-create-update-d29bm" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.201617 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8c9b-account-create-update-d29bm" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.285849 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jrxqx" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.574131 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-f66vm"] Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.575272 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-f66vm" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.585487 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-f66vm"] Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.648426 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5255-account-create-update-k87hd"] Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.650043 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5255-account-create-update-k87hd" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.652743 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.663214 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5255-account-create-update-k87hd"] Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.675079 4781 generic.go:334] "Generic (PLEG): container finished" podID="c7ca2a9f-a42e-4d9b-89a7-f2590842f328" containerID="592b25e10dba92f06ec6db612c25fdc12d9afc456496a972e547225b9ac93f91" exitCode=0 Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.675115 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c7ca2a9f-a42e-4d9b-89a7-f2590842f328","Type":"ContainerDied","Data":"592b25e10dba92f06ec6db612c25fdc12d9afc456496a972e547225b9ac93f91"} Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.706806 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c6016e5-2641-4b82-b164-121ae822f863-operator-scripts\") pod \"keystone-5255-account-create-update-k87hd\" (UID: \"2c6016e5-2641-4b82-b164-121ae822f863\") " pod="openstack/keystone-5255-account-create-update-k87hd" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.706860 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpskz\" (UniqueName: \"kubernetes.io/projected/0cec0cd3-abcd-484c-85b8-03a44888a9b7-kube-api-access-dpskz\") pod \"keystone-db-create-f66vm\" (UID: \"0cec0cd3-abcd-484c-85b8-03a44888a9b7\") " pod="openstack/keystone-db-create-f66vm" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.707935 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cec0cd3-abcd-484c-85b8-03a44888a9b7-operator-scripts\") pod \"keystone-db-create-f66vm\" (UID: \"0cec0cd3-abcd-484c-85b8-03a44888a9b7\") " pod="openstack/keystone-db-create-f66vm" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.708173 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfbcp\" (UniqueName: \"kubernetes.io/projected/2c6016e5-2641-4b82-b164-121ae822f863-kube-api-access-zfbcp\") pod \"keystone-5255-account-create-update-k87hd\" (UID: \"2c6016e5-2641-4b82-b164-121ae822f863\") " pod="openstack/keystone-5255-account-create-update-k87hd" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.810508 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cec0cd3-abcd-484c-85b8-03a44888a9b7-operator-scripts\") pod \"keystone-db-create-f66vm\" (UID: \"0cec0cd3-abcd-484c-85b8-03a44888a9b7\") " pod="openstack/keystone-db-create-f66vm" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.810735 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfbcp\" (UniqueName: \"kubernetes.io/projected/2c6016e5-2641-4b82-b164-121ae822f863-kube-api-access-zfbcp\") pod \"keystone-5255-account-create-update-k87hd\" (UID: \"2c6016e5-2641-4b82-b164-121ae822f863\") " pod="openstack/keystone-5255-account-create-update-k87hd" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.810816 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c6016e5-2641-4b82-b164-121ae822f863-operator-scripts\") pod \"keystone-5255-account-create-update-k87hd\" (UID: \"2c6016e5-2641-4b82-b164-121ae822f863\") " pod="openstack/keystone-5255-account-create-update-k87hd" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.810852 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpskz\" (UniqueName: \"kubernetes.io/projected/0cec0cd3-abcd-484c-85b8-03a44888a9b7-kube-api-access-dpskz\") pod \"keystone-db-create-f66vm\" (UID: \"0cec0cd3-abcd-484c-85b8-03a44888a9b7\") " pod="openstack/keystone-db-create-f66vm" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.811455 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cec0cd3-abcd-484c-85b8-03a44888a9b7-operator-scripts\") pod \"keystone-db-create-f66vm\" (UID: \"0cec0cd3-abcd-484c-85b8-03a44888a9b7\") " pod="openstack/keystone-db-create-f66vm" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.812310 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c6016e5-2641-4b82-b164-121ae822f863-operator-scripts\") pod \"keystone-5255-account-create-update-k87hd\" (UID: \"2c6016e5-2641-4b82-b164-121ae822f863\") " pod="openstack/keystone-5255-account-create-update-k87hd" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.827303 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfbcp\" (UniqueName: \"kubernetes.io/projected/2c6016e5-2641-4b82-b164-121ae822f863-kube-api-access-zfbcp\") pod \"keystone-5255-account-create-update-k87hd\" (UID: \"2c6016e5-2641-4b82-b164-121ae822f863\") " pod="openstack/keystone-5255-account-create-update-k87hd" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.830861 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpskz\" (UniqueName: \"kubernetes.io/projected/0cec0cd3-abcd-484c-85b8-03a44888a9b7-kube-api-access-dpskz\") pod \"keystone-db-create-f66vm\" (UID: \"0cec0cd3-abcd-484c-85b8-03a44888a9b7\") " pod="openstack/keystone-db-create-f66vm" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.900213 4781 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-f66vm" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.911171 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-94sd2"] Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.912707 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-94sd2" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.920464 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-94sd2"] Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.964591 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5255-account-create-update-k87hd" Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.974341 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-547a-account-create-update-tf2pb"] Feb 27 00:25:53 crc kubenswrapper[4781]: I0227 00:25:53.992480 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-547a-account-create-update-tf2pb" Feb 27 00:25:54 crc kubenswrapper[4781]: I0227 00:25:54.005817 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-547a-account-create-update-tf2pb"] Feb 27 00:25:54 crc kubenswrapper[4781]: I0227 00:25:54.008396 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 27 00:25:54 crc kubenswrapper[4781]: I0227 00:25:54.017748 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c90ad80e-9897-4e20-b9b0-6add43c84bd0-operator-scripts\") pod \"placement-db-create-94sd2\" (UID: \"c90ad80e-9897-4e20-b9b0-6add43c84bd0\") " pod="openstack/placement-db-create-94sd2" Feb 27 00:25:54 crc kubenswrapper[4781]: I0227 00:25:54.018148 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j978s\" (UniqueName: \"kubernetes.io/projected/c90ad80e-9897-4e20-b9b0-6add43c84bd0-kube-api-access-j978s\") pod \"placement-db-create-94sd2\" (UID: \"c90ad80e-9897-4e20-b9b0-6add43c84bd0\") " pod="openstack/placement-db-create-94sd2" Feb 27 00:25:54 crc kubenswrapper[4781]: I0227 00:25:54.018206 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlll5\" (UniqueName: \"kubernetes.io/projected/6bdd8664-6d91-4616-8095-f44067fdca51-kube-api-access-zlll5\") pod \"placement-547a-account-create-update-tf2pb\" (UID: \"6bdd8664-6d91-4616-8095-f44067fdca51\") " pod="openstack/placement-547a-account-create-update-tf2pb" Feb 27 00:25:54 crc kubenswrapper[4781]: I0227 00:25:54.018312 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bdd8664-6d91-4616-8095-f44067fdca51-operator-scripts\") pod 
\"placement-547a-account-create-update-tf2pb\" (UID: \"6bdd8664-6d91-4616-8095-f44067fdca51\") " pod="openstack/placement-547a-account-create-update-tf2pb" Feb 27 00:25:54 crc kubenswrapper[4781]: I0227 00:25:54.119661 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:25:54 crc kubenswrapper[4781]: E0227 00:25:54.119879 4781 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 27 00:25:54 crc kubenswrapper[4781]: E0227 00:25:54.120075 4781 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 27 00:25:54 crc kubenswrapper[4781]: E0227 00:25:54.120129 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift podName:fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11 nodeName:}" failed. No retries permitted until 2026-02-27 00:26:02.120110236 +0000 UTC m=+1231.377649790 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift") pod "swift-storage-0" (UID: "fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11") : configmap "swift-ring-files" not found Feb 27 00:25:54 crc kubenswrapper[4781]: I0227 00:25:54.120044 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j978s\" (UniqueName: \"kubernetes.io/projected/c90ad80e-9897-4e20-b9b0-6add43c84bd0-kube-api-access-j978s\") pod \"placement-db-create-94sd2\" (UID: \"c90ad80e-9897-4e20-b9b0-6add43c84bd0\") " pod="openstack/placement-db-create-94sd2" Feb 27 00:25:54 crc kubenswrapper[4781]: I0227 00:25:54.120224 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlll5\" (UniqueName: \"kubernetes.io/projected/6bdd8664-6d91-4616-8095-f44067fdca51-kube-api-access-zlll5\") pod \"placement-547a-account-create-update-tf2pb\" (UID: \"6bdd8664-6d91-4616-8095-f44067fdca51\") " pod="openstack/placement-547a-account-create-update-tf2pb" Feb 27 00:25:54 crc kubenswrapper[4781]: I0227 00:25:54.120352 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bdd8664-6d91-4616-8095-f44067fdca51-operator-scripts\") pod \"placement-547a-account-create-update-tf2pb\" (UID: \"6bdd8664-6d91-4616-8095-f44067fdca51\") " pod="openstack/placement-547a-account-create-update-tf2pb" Feb 27 00:25:54 crc kubenswrapper[4781]: I0227 00:25:54.120415 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c90ad80e-9897-4e20-b9b0-6add43c84bd0-operator-scripts\") pod \"placement-db-create-94sd2\" (UID: \"c90ad80e-9897-4e20-b9b0-6add43c84bd0\") " pod="openstack/placement-db-create-94sd2" Feb 27 00:25:54 crc kubenswrapper[4781]: I0227 00:25:54.121066 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bdd8664-6d91-4616-8095-f44067fdca51-operator-scripts\") pod \"placement-547a-account-create-update-tf2pb\" (UID: \"6bdd8664-6d91-4616-8095-f44067fdca51\") " pod="openstack/placement-547a-account-create-update-tf2pb" Feb 27 00:25:54 crc kubenswrapper[4781]: I0227 00:25:54.121129 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c90ad80e-9897-4e20-b9b0-6add43c84bd0-operator-scripts\") pod \"placement-db-create-94sd2\" (UID: \"c90ad80e-9897-4e20-b9b0-6add43c84bd0\") " pod="openstack/placement-db-create-94sd2" Feb 27 00:25:54 crc kubenswrapper[4781]: I0227 00:25:54.136423 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j978s\" (UniqueName: \"kubernetes.io/projected/c90ad80e-9897-4e20-b9b0-6add43c84bd0-kube-api-access-j978s\") pod \"placement-db-create-94sd2\" (UID: \"c90ad80e-9897-4e20-b9b0-6add43c84bd0\") " pod="openstack/placement-db-create-94sd2" Feb 27 00:25:54 crc kubenswrapper[4781]: I0227 00:25:54.137655 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlll5\" (UniqueName: \"kubernetes.io/projected/6bdd8664-6d91-4616-8095-f44067fdca51-kube-api-access-zlll5\") pod \"placement-547a-account-create-update-tf2pb\" (UID: \"6bdd8664-6d91-4616-8095-f44067fdca51\") " pod="openstack/placement-547a-account-create-update-tf2pb" Feb 27 00:25:54 crc kubenswrapper[4781]: I0227 00:25:54.231015 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-94sd2" Feb 27 00:25:54 crc kubenswrapper[4781]: I0227 00:25:54.319679 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-547a-account-create-update-tf2pb" Feb 27 00:25:54 crc kubenswrapper[4781]: I0227 00:25:54.732773 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:25:55 crc kubenswrapper[4781]: I0227 00:25:55.484810 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:25:55 crc kubenswrapper[4781]: I0227 00:25:55.556762 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-4p8q8"] Feb 27 00:25:55 crc kubenswrapper[4781]: I0227 00:25:55.693099 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9zkpb" event={"ID":"092921e0-a033-4021-b0f5-9c89de3aa830","Type":"ContainerStarted","Data":"c6cb33670dd8b2bfdec6a38612fc1c0b4ed5fb2f72c14df8421862c9e5769e03"} Feb 27 00:25:55 crc kubenswrapper[4781]: I0227 00:25:55.693278 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" podUID="4e0e3c40-86af-4986-bf58-fa79ce187828" containerName="dnsmasq-dns" containerID="cri-o://5e167781defb17ca0ec0a66f7db9a5b9464f7410f615b9d338fc5bcbdaa6963b" gracePeriod=10 Feb 27 00:25:55 crc kubenswrapper[4781]: I0227 00:25:55.693698 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-9zkpb" Feb 27 00:25:55 crc kubenswrapper[4781]: I0227 00:25:55.828145 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-nqbgf" Feb 27 00:25:55 crc kubenswrapper[4781]: I0227 00:25:55.843046 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-9zkpb" podStartSLOduration=17.842277554 podStartE2EDuration="48.843023726s" podCreationTimestamp="2026-02-27 00:25:07 +0000 UTC" firstStartedPulling="2026-02-27 00:25:18.862978573 
+0000 UTC m=+1188.120518127" lastFinishedPulling="2026-02-27 00:25:49.863724745 +0000 UTC m=+1219.121264299" observedRunningTime="2026-02-27 00:25:55.719560696 +0000 UTC m=+1224.977100250" watchObservedRunningTime="2026-02-27 00:25:55.843023726 +0000 UTC m=+1225.100563300" Feb 27 00:25:55 crc kubenswrapper[4781]: I0227 00:25:55.955221 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-whkj4" Feb 27 00:25:56 crc kubenswrapper[4781]: I0227 00:25:56.705548 4781 generic.go:334] "Generic (PLEG): container finished" podID="4e0e3c40-86af-4986-bf58-fa79ce187828" containerID="5e167781defb17ca0ec0a66f7db9a5b9464f7410f615b9d338fc5bcbdaa6963b" exitCode=0 Feb 27 00:25:56 crc kubenswrapper[4781]: I0227 00:25:56.705642 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" event={"ID":"4e0e3c40-86af-4986-bf58-fa79ce187828","Type":"ContainerDied","Data":"5e167781defb17ca0ec0a66f7db9a5b9464f7410f615b9d338fc5bcbdaa6963b"} Feb 27 00:25:57 crc kubenswrapper[4781]: I0227 00:25:57.057391 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="2691e066-2f4c-4e7e-bcac-01933bd6cadb" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 27 00:25:57 crc kubenswrapper[4781]: I0227 00:25:57.106082 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 27 00:25:59 crc kubenswrapper[4781]: I0227 00:25:59.405538 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-mmj84"] Feb 27 00:25:59 crc kubenswrapper[4781]: I0227 00:25:59.409077 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-mmj84" Feb 27 00:25:59 crc kubenswrapper[4781]: I0227 00:25:59.412596 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 27 00:25:59 crc kubenswrapper[4781]: I0227 00:25:59.426963 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-mmj84"] Feb 27 00:25:59 crc kubenswrapper[4781]: I0227 00:25:59.545731 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjn7t\" (UniqueName: \"kubernetes.io/projected/c986902c-3a54-4300-a078-2e70d305e97e-kube-api-access-tjn7t\") pod \"root-account-create-update-mmj84\" (UID: \"c986902c-3a54-4300-a078-2e70d305e97e\") " pod="openstack/root-account-create-update-mmj84" Feb 27 00:25:59 crc kubenswrapper[4781]: I0227 00:25:59.545818 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c986902c-3a54-4300-a078-2e70d305e97e-operator-scripts\") pod \"root-account-create-update-mmj84\" (UID: \"c986902c-3a54-4300-a078-2e70d305e97e\") " pod="openstack/root-account-create-update-mmj84" Feb 27 00:25:59 crc kubenswrapper[4781]: I0227 00:25:59.648425 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjn7t\" (UniqueName: \"kubernetes.io/projected/c986902c-3a54-4300-a078-2e70d305e97e-kube-api-access-tjn7t\") pod \"root-account-create-update-mmj84\" (UID: \"c986902c-3a54-4300-a078-2e70d305e97e\") " pod="openstack/root-account-create-update-mmj84" Feb 27 00:25:59 crc kubenswrapper[4781]: I0227 00:25:59.648504 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c986902c-3a54-4300-a078-2e70d305e97e-operator-scripts\") pod \"root-account-create-update-mmj84\" (UID: 
\"c986902c-3a54-4300-a078-2e70d305e97e\") " pod="openstack/root-account-create-update-mmj84" Feb 27 00:25:59 crc kubenswrapper[4781]: I0227 00:25:59.649428 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c986902c-3a54-4300-a078-2e70d305e97e-operator-scripts\") pod \"root-account-create-update-mmj84\" (UID: \"c986902c-3a54-4300-a078-2e70d305e97e\") " pod="openstack/root-account-create-update-mmj84" Feb 27 00:25:59 crc kubenswrapper[4781]: I0227 00:25:59.672853 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjn7t\" (UniqueName: \"kubernetes.io/projected/c986902c-3a54-4300-a078-2e70d305e97e-kube-api-access-tjn7t\") pod \"root-account-create-update-mmj84\" (UID: \"c986902c-3a54-4300-a078-2e70d305e97e\") " pod="openstack/root-account-create-update-mmj84" Feb 27 00:25:59 crc kubenswrapper[4781]: I0227 00:25:59.731495 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" podUID="4e0e3c40-86af-4986-bf58-fa79ce187828" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: connect: connection refused" Feb 27 00:25:59 crc kubenswrapper[4781]: I0227 00:25:59.752401 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-mmj84" Feb 27 00:26:00 crc kubenswrapper[4781]: I0227 00:26:00.136974 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535866-qpv8l"] Feb 27 00:26:00 crc kubenswrapper[4781]: I0227 00:26:00.138257 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535866-qpv8l" Feb 27 00:26:00 crc kubenswrapper[4781]: I0227 00:26:00.140970 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:26:00 crc kubenswrapper[4781]: I0227 00:26:00.141507 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:26:00 crc kubenswrapper[4781]: I0227 00:26:00.141716 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:26:00 crc kubenswrapper[4781]: I0227 00:26:00.154240 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535866-qpv8l"] Feb 27 00:26:00 crc kubenswrapper[4781]: I0227 00:26:00.258425 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z28p\" (UniqueName: \"kubernetes.io/projected/1fe4edac-acb6-4906-9b3b-42b7c7a98943-kube-api-access-7z28p\") pod \"auto-csr-approver-29535866-qpv8l\" (UID: \"1fe4edac-acb6-4906-9b3b-42b7c7a98943\") " pod="openshift-infra/auto-csr-approver-29535866-qpv8l" Feb 27 00:26:00 crc kubenswrapper[4781]: I0227 00:26:00.361540 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z28p\" (UniqueName: \"kubernetes.io/projected/1fe4edac-acb6-4906-9b3b-42b7c7a98943-kube-api-access-7z28p\") pod \"auto-csr-approver-29535866-qpv8l\" (UID: \"1fe4edac-acb6-4906-9b3b-42b7c7a98943\") " pod="openshift-infra/auto-csr-approver-29535866-qpv8l" Feb 27 00:26:00 crc kubenswrapper[4781]: I0227 00:26:00.378554 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z28p\" (UniqueName: \"kubernetes.io/projected/1fe4edac-acb6-4906-9b3b-42b7c7a98943-kube-api-access-7z28p\") pod \"auto-csr-approver-29535866-qpv8l\" (UID: \"1fe4edac-acb6-4906-9b3b-42b7c7a98943\") " 
pod="openshift-infra/auto-csr-approver-29535866-qpv8l" Feb 27 00:26:00 crc kubenswrapper[4781]: I0227 00:26:00.466265 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535866-qpv8l" Feb 27 00:26:00 crc kubenswrapper[4781]: I0227 00:26:00.956231 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-jrxqx"] Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.011297 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.080795 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-config\") pod \"4e0e3c40-86af-4986-bf58-fa79ce187828\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.080826 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-dns-svc\") pod \"4e0e3c40-86af-4986-bf58-fa79ce187828\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.080892 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-ovsdbserver-nb\") pod \"4e0e3c40-86af-4986-bf58-fa79ce187828\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.080968 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-ovsdbserver-sb\") pod \"4e0e3c40-86af-4986-bf58-fa79ce187828\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " Feb 27 00:26:01 
crc kubenswrapper[4781]: I0227 00:26:01.081040 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4cks\" (UniqueName: \"kubernetes.io/projected/4e0e3c40-86af-4986-bf58-fa79ce187828-kube-api-access-j4cks\") pod \"4e0e3c40-86af-4986-bf58-fa79ce187828\" (UID: \"4e0e3c40-86af-4986-bf58-fa79ce187828\") " Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.145898 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e0e3c40-86af-4986-bf58-fa79ce187828-kube-api-access-j4cks" (OuterVolumeSpecName: "kube-api-access-j4cks") pod "4e0e3c40-86af-4986-bf58-fa79ce187828" (UID: "4e0e3c40-86af-4986-bf58-fa79ce187828"). InnerVolumeSpecName "kube-api-access-j4cks". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.183725 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4cks\" (UniqueName: \"kubernetes.io/projected/4e0e3c40-86af-4986-bf58-fa79ce187828-kube-api-access-j4cks\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.278767 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8c9b-account-create-update-d29bm"] Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.428905 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4e0e3c40-86af-4986-bf58-fa79ce187828" (UID: "4e0e3c40-86af-4986-bf58-fa79ce187828"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.434751 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-547a-account-create-update-tf2pb"] Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.473170 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4e0e3c40-86af-4986-bf58-fa79ce187828" (UID: "4e0e3c40-86af-4986-bf58-fa79ce187828"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.483973 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-config" (OuterVolumeSpecName: "config") pod "4e0e3c40-86af-4986-bf58-fa79ce187828" (UID: "4e0e3c40-86af-4986-bf58-fa79ce187828"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.489728 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.489757 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.489771 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.515743 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4e0e3c40-86af-4986-bf58-fa79ce187828" (UID: "4e0e3c40-86af-4986-bf58-fa79ce187828"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.591677 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e0e3c40-86af-4986-bf58-fa79ce187828-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.755051 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-547a-account-create-update-tf2pb" event={"ID":"6bdd8664-6d91-4616-8095-f44067fdca51","Type":"ContainerStarted","Data":"36e84b5ed003c240081e5afc68dd15e5fc943e783f85671eaa2c34c2afee47fd"} Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.757354 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"919ba171-1971-416c-99c1-5dfcacc10a28","Type":"ContainerStarted","Data":"dba61167d0afe993e4003914cea9d9f848cdf347ff3ec2b7de108d354e6bd19f"} Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.757686 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.761271 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c7ca2a9f-a42e-4d9b-89a7-f2590842f328","Type":"ContainerStarted","Data":"84e4c6c19d757fd81ef5f856104b51d9057ffe90f91b0313f39e58f7d670a984"} Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.762051 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.765657 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"bd103c67-d035-4de1-aba9-667d1eb67813","Type":"ContainerStarted","Data":"90aac18f99b1b6de59145c17a7f71bca0b8c502500fa3e1f4423e423d4f545c8"} Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.775997 4781 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1f85c54b-b800-429a-ba2d-fe22056ac907","Type":"ContainerStarted","Data":"490f54d4fc0654da6b5add2d9e470584271088a4fc9d0ff0972339bc97ab6f8f"} Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.785966 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-6n9rn" event={"ID":"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b","Type":"ContainerStarted","Data":"7d3236f4301015aa89ce006050dc39e2b0704b179ed73e64d6106302850331f9"} Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.791135 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" event={"ID":"4e0e3c40-86af-4986-bf58-fa79ce187828","Type":"ContainerDied","Data":"5f275f16796eb838b7768e9505dd01e92ce2776a3d08fa355285fb9c70fdc93f"} Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.791179 4781 scope.go:117] "RemoveContainer" containerID="5e167781defb17ca0ec0a66f7db9a5b9464f7410f615b9d338fc5bcbdaa6963b" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.791278 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-4p8q8" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.792906 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-94sd2"] Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.803866 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535866-qpv8l"] Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.818144 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5255-account-create-update-k87hd"] Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.822545 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"58009056-4183-4017-bfa1-c14ce28b92ea","Type":"ContainerStarted","Data":"e218b31bb4894325c8b13e3e40f2780bfeb716eee793f6e19a4b09e747c5de90"} Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.824277 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=51.55299725 podStartE2EDuration="1m4.824259858s" podCreationTimestamp="2026-02-27 00:24:57 +0000 UTC" firstStartedPulling="2026-02-27 00:25:04.277396166 +0000 UTC m=+1173.534935720" lastFinishedPulling="2026-02-27 00:25:17.548658774 +0000 UTC m=+1186.806198328" observedRunningTime="2026-02-27 00:26:01.784448683 +0000 UTC m=+1231.041988237" watchObservedRunningTime="2026-02-27 00:26:01.824259858 +0000 UTC m=+1231.081799402" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.827151 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8c9b-account-create-update-d29bm" event={"ID":"bb4687ec-812e-48bb-8d53-ed628f3cd013","Type":"ContainerStarted","Data":"a52130091c9100982624c568d31dc83849096589647f47661d0debdea301a332"} Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.827204 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-8c9b-account-create-update-d29bm" event={"ID":"bb4687ec-812e-48bb-8d53-ed628f3cd013","Type":"ContainerStarted","Data":"1c0a5f73b00e8a9559b8d8378ce7177f49256b9d4979478bb067d80bc66e07fc"} Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.836775 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-f66vm"] Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.838810 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=13.058098145 podStartE2EDuration="54.83878956s" podCreationTimestamp="2026-02-27 00:25:07 +0000 UTC" firstStartedPulling="2026-02-27 00:25:19.000926433 +0000 UTC m=+1188.258465997" lastFinishedPulling="2026-02-27 00:26:00.781617858 +0000 UTC m=+1230.039157412" observedRunningTime="2026-02-27 00:26:01.831117918 +0000 UTC m=+1231.088657472" watchObservedRunningTime="2026-02-27 00:26:01.83878956 +0000 UTC m=+1231.096329114" Feb 27 00:26:01 crc kubenswrapper[4781]: W0227 00:26:01.841076 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fe4edac_acb6_4906_9b3b_42b7c7a98943.slice/crio-557e493fffe4db2dcddedbe073df198bb6b870988be78491ac11e021467b67f0 WatchSource:0}: Error finding container 557e493fffe4db2dcddedbe073df198bb6b870988be78491ac11e021467b67f0: Status 404 returned error can't find the container with id 557e493fffe4db2dcddedbe073df198bb6b870988be78491ac11e021467b67f0 Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.841252 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jrxqx" event={"ID":"f1713962-9458-45b2-9f28-61409b7ff581","Type":"ContainerStarted","Data":"b01d66bc253f93ef989863fc6fd69c5afb4405a98783d9e32be4f4b80ce3df36"} Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.841308 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jrxqx" 
event={"ID":"f1713962-9458-45b2-9f28-61409b7ff581","Type":"ContainerStarted","Data":"0c4e303b9cbf7a244c07b330df27d0efc55bd33315e704ad0931c0431621f3c4"} Feb 27 00:26:01 crc kubenswrapper[4781]: W0227 00:26:01.857786 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cec0cd3_abcd_484c_85b8_03a44888a9b7.slice/crio-8173fa3b8afd38768e66ef999c136681772389689078482ef577e5ddcd6b877a WatchSource:0}: Error finding container 8173fa3b8afd38768e66ef999c136681772389689078482ef577e5ddcd6b877a: Status 404 returned error can't find the container with id 8173fa3b8afd38768e66ef999c136681772389689078482ef577e5ddcd6b877a Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.887279 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-mmj84"] Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.936640 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=63.936606716 podStartE2EDuration="1m3.936606716s" podCreationTimestamp="2026-02-27 00:24:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:26:01.865462379 +0000 UTC m=+1231.123001943" watchObservedRunningTime="2026-02-27 00:26:01.936606716 +0000 UTC m=+1231.194146270" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.952602 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-8c9b-account-create-update-d29bm" podStartSLOduration=9.952579915 podStartE2EDuration="9.952579915s" podCreationTimestamp="2026-02-27 00:25:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:26:01.885492525 +0000 UTC m=+1231.143032079" watchObservedRunningTime="2026-02-27 00:26:01.952579915 +0000 UTC 
m=+1231.210119469" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.968062 4781 scope.go:117] "RemoveContainer" containerID="aabc0546990906abca2b35d28ee124be17d9e0ff7483abdc7c60ccee1cba5f86" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.980383 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-jrxqx" podStartSLOduration=9.980362965 podStartE2EDuration="9.980362965s" podCreationTimestamp="2026-02-27 00:25:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:26:01.923460471 +0000 UTC m=+1231.181000045" watchObservedRunningTime="2026-02-27 00:26:01.980362965 +0000 UTC m=+1231.237902519" Feb 27 00:26:01 crc kubenswrapper[4781]: I0227 00:26:01.989410 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-6n9rn" podStartSLOduration=2.7783356230000003 podStartE2EDuration="15.989392351s" podCreationTimestamp="2026-02-27 00:25:46 +0000 UTC" firstStartedPulling="2026-02-27 00:25:47.439319026 +0000 UTC m=+1216.696858590" lastFinishedPulling="2026-02-27 00:26:00.650375754 +0000 UTC m=+1229.907915318" observedRunningTime="2026-02-27 00:26:01.945269584 +0000 UTC m=+1231.202809138" watchObservedRunningTime="2026-02-27 00:26:01.989392351 +0000 UTC m=+1231.246931905" Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.002915 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-4p8q8"] Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.014597 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-4p8q8"] Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.217069 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift\") pod \"swift-storage-0\" (UID: 
\"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:26:02 crc kubenswrapper[4781]: E0227 00:26:02.217270 4781 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 27 00:26:02 crc kubenswrapper[4781]: E0227 00:26:02.217283 4781 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 27 00:26:02 crc kubenswrapper[4781]: E0227 00:26:02.217330 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift podName:fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11 nodeName:}" failed. No retries permitted until 2026-02-27 00:26:18.217315802 +0000 UTC m=+1247.474855356 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift") pod "swift-storage-0" (UID: "fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11") : configmap "swift-ring-files" not found Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.676251 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.865862 4781 generic.go:334] "Generic (PLEG): container finished" podID="f1713962-9458-45b2-9f28-61409b7ff581" containerID="b01d66bc253f93ef989863fc6fd69c5afb4405a98783d9e32be4f4b80ce3df36" exitCode=0 Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.865946 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jrxqx" event={"ID":"f1713962-9458-45b2-9f28-61409b7ff581","Type":"ContainerDied","Data":"b01d66bc253f93ef989863fc6fd69c5afb4405a98783d9e32be4f4b80ce3df36"} Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.908553 4781 generic.go:334] "Generic (PLEG): container finished" podID="bb4687ec-812e-48bb-8d53-ed628f3cd013" 
containerID="a52130091c9100982624c568d31dc83849096589647f47661d0debdea301a332" exitCode=0 Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.908688 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8c9b-account-create-update-d29bm" event={"ID":"bb4687ec-812e-48bb-8d53-ed628f3cd013","Type":"ContainerDied","Data":"a52130091c9100982624c568d31dc83849096589647f47661d0debdea301a332"} Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.916005 4781 generic.go:334] "Generic (PLEG): container finished" podID="2c6016e5-2641-4b82-b164-121ae822f863" containerID="b5253e8bb3200baca59ed8e598dc74eaddbc9fc4ea687d121523ff8347b4d62e" exitCode=0 Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.916122 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5255-account-create-update-k87hd" event={"ID":"2c6016e5-2641-4b82-b164-121ae822f863","Type":"ContainerDied","Data":"b5253e8bb3200baca59ed8e598dc74eaddbc9fc4ea687d121523ff8347b4d62e"} Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.916153 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5255-account-create-update-k87hd" event={"ID":"2c6016e5-2641-4b82-b164-121ae822f863","Type":"ContainerStarted","Data":"c5015fd63243058303378753db482675b9cd87268a9cd806c0e30c22950038da"} Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.918094 4781 generic.go:334] "Generic (PLEG): container finished" podID="c90ad80e-9897-4e20-b9b0-6add43c84bd0" containerID="4d55d2c6e343b6a1d3b8b47dac42837612db67ccce352ab276d326d2b146954e" exitCode=0 Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.918150 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-94sd2" event={"ID":"c90ad80e-9897-4e20-b9b0-6add43c84bd0","Type":"ContainerDied","Data":"4d55d2c6e343b6a1d3b8b47dac42837612db67ccce352ab276d326d2b146954e"} Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.918170 4781 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/placement-db-create-94sd2" event={"ID":"c90ad80e-9897-4e20-b9b0-6add43c84bd0","Type":"ContainerStarted","Data":"5b5249fbcfef480f4709ba4ec70ed903f8b332b1831523d320ff5af46111ef22"} Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.920505 4781 generic.go:334] "Generic (PLEG): container finished" podID="c986902c-3a54-4300-a078-2e70d305e97e" containerID="3d01f4c64b31dda5359f791eed0af9accdc107437765895fcc3cd585df0f55ae" exitCode=0 Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.920564 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mmj84" event={"ID":"c986902c-3a54-4300-a078-2e70d305e97e","Type":"ContainerDied","Data":"3d01f4c64b31dda5359f791eed0af9accdc107437765895fcc3cd585df0f55ae"} Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.920689 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mmj84" event={"ID":"c986902c-3a54-4300-a078-2e70d305e97e","Type":"ContainerStarted","Data":"6b8b1c057f209e42dffb4f53945e78631d479cd15aecac6c6f86368ea8a7b90e"} Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.923985 4781 generic.go:334] "Generic (PLEG): container finished" podID="6bdd8664-6d91-4616-8095-f44067fdca51" containerID="8fb72d9409a124bb8fa0479e75bf3cf0cd120b3aae8696f10bef9465f2261fc6" exitCode=0 Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.924052 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-547a-account-create-update-tf2pb" event={"ID":"6bdd8664-6d91-4616-8095-f44067fdca51","Type":"ContainerDied","Data":"8fb72d9409a124bb8fa0479e75bf3cf0cd120b3aae8696f10bef9465f2261fc6"} Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.925567 4781 generic.go:334] "Generic (PLEG): container finished" podID="0cec0cd3-abcd-484c-85b8-03a44888a9b7" containerID="74853e0dfa3329c0157368e93fb3d1251b7149a8041ea7981936c9bd91076b44" exitCode=0 Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.925659 4781 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-f66vm" event={"ID":"0cec0cd3-abcd-484c-85b8-03a44888a9b7","Type":"ContainerDied","Data":"74853e0dfa3329c0157368e93fb3d1251b7149a8041ea7981936c9bd91076b44"} Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.925688 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-f66vm" event={"ID":"0cec0cd3-abcd-484c-85b8-03a44888a9b7","Type":"ContainerStarted","Data":"8173fa3b8afd38768e66ef999c136681772389689078482ef577e5ddcd6b877a"} Feb 27 00:26:02 crc kubenswrapper[4781]: I0227 00:26:02.941077 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535866-qpv8l" event={"ID":"1fe4edac-acb6-4906-9b3b-42b7c7a98943","Type":"ContainerStarted","Data":"557e493fffe4db2dcddedbe073df198bb6b870988be78491ac11e021467b67f0"} Feb 27 00:26:03 crc kubenswrapper[4781]: I0227 00:26:03.322066 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e0e3c40-86af-4986-bf58-fa79ce187828" path="/var/lib/kubelet/pods/4e0e3c40-86af-4986-bf58-fa79ce187828/volumes" Feb 27 00:26:03 crc kubenswrapper[4781]: I0227 00:26:03.675956 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 27 00:26:03 crc kubenswrapper[4781]: I0227 00:26:03.944155 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1f85c54b-b800-429a-ba2d-fe22056ac907","Type":"ContainerStarted","Data":"c6e860c6c62b63e5a5fe835a4877c45040a36e7fc332cce5af395a3eaa5e24b1"} Feb 27 00:26:03 crc kubenswrapper[4781]: I0227 00:26:03.949216 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535866-qpv8l" event={"ID":"1fe4edac-acb6-4906-9b3b-42b7c7a98943","Type":"ContainerStarted","Data":"28555d58f1fd114e239212917d6df64a83d89ed63bf1f65157974daf4ae101b8"} Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.318092 4781 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-f66vm" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.367081 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpskz\" (UniqueName: \"kubernetes.io/projected/0cec0cd3-abcd-484c-85b8-03a44888a9b7-kube-api-access-dpskz\") pod \"0cec0cd3-abcd-484c-85b8-03a44888a9b7\" (UID: \"0cec0cd3-abcd-484c-85b8-03a44888a9b7\") " Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.367216 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cec0cd3-abcd-484c-85b8-03a44888a9b7-operator-scripts\") pod \"0cec0cd3-abcd-484c-85b8-03a44888a9b7\" (UID: \"0cec0cd3-abcd-484c-85b8-03a44888a9b7\") " Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.370478 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cec0cd3-abcd-484c-85b8-03a44888a9b7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0cec0cd3-abcd-484c-85b8-03a44888a9b7" (UID: "0cec0cd3-abcd-484c-85b8-03a44888a9b7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.400849 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cec0cd3-abcd-484c-85b8-03a44888a9b7-kube-api-access-dpskz" (OuterVolumeSpecName: "kube-api-access-dpskz") pod "0cec0cd3-abcd-484c-85b8-03a44888a9b7" (UID: "0cec0cd3-abcd-484c-85b8-03a44888a9b7"). InnerVolumeSpecName "kube-api-access-dpskz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.471930 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpskz\" (UniqueName: \"kubernetes.io/projected/0cec0cd3-abcd-484c-85b8-03a44888a9b7-kube-api-access-dpskz\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.471964 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cec0cd3-abcd-484c-85b8-03a44888a9b7-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.597685 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-94sd2" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.612227 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jrxqx" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.675195 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1713962-9458-45b2-9f28-61409b7ff581-operator-scripts\") pod \"f1713962-9458-45b2-9f28-61409b7ff581\" (UID: \"f1713962-9458-45b2-9f28-61409b7ff581\") " Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.675287 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chshx\" (UniqueName: \"kubernetes.io/projected/f1713962-9458-45b2-9f28-61409b7ff581-kube-api-access-chshx\") pod \"f1713962-9458-45b2-9f28-61409b7ff581\" (UID: \"f1713962-9458-45b2-9f28-61409b7ff581\") " Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.675370 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c90ad80e-9897-4e20-b9b0-6add43c84bd0-operator-scripts\") pod 
\"c90ad80e-9897-4e20-b9b0-6add43c84bd0\" (UID: \"c90ad80e-9897-4e20-b9b0-6add43c84bd0\") " Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.675423 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j978s\" (UniqueName: \"kubernetes.io/projected/c90ad80e-9897-4e20-b9b0-6add43c84bd0-kube-api-access-j978s\") pod \"c90ad80e-9897-4e20-b9b0-6add43c84bd0\" (UID: \"c90ad80e-9897-4e20-b9b0-6add43c84bd0\") " Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.675659 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1713962-9458-45b2-9f28-61409b7ff581-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f1713962-9458-45b2-9f28-61409b7ff581" (UID: "f1713962-9458-45b2-9f28-61409b7ff581"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.676114 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1713962-9458-45b2-9f28-61409b7ff581-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.676742 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c90ad80e-9897-4e20-b9b0-6add43c84bd0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c90ad80e-9897-4e20-b9b0-6add43c84bd0" (UID: "c90ad80e-9897-4e20-b9b0-6add43c84bd0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.680827 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1713962-9458-45b2-9f28-61409b7ff581-kube-api-access-chshx" (OuterVolumeSpecName: "kube-api-access-chshx") pod "f1713962-9458-45b2-9f28-61409b7ff581" (UID: "f1713962-9458-45b2-9f28-61409b7ff581"). 
InnerVolumeSpecName "kube-api-access-chshx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.681038 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c90ad80e-9897-4e20-b9b0-6add43c84bd0-kube-api-access-j978s" (OuterVolumeSpecName: "kube-api-access-j978s") pod "c90ad80e-9897-4e20-b9b0-6add43c84bd0" (UID: "c90ad80e-9897-4e20-b9b0-6add43c84bd0"). InnerVolumeSpecName "kube-api-access-j978s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.764107 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.777945 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j978s\" (UniqueName: \"kubernetes.io/projected/c90ad80e-9897-4e20-b9b0-6add43c84bd0-kube-api-access-j978s\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.777998 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chshx\" (UniqueName: \"kubernetes.io/projected/f1713962-9458-45b2-9f28-61409b7ff581-kube-api-access-chshx\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.778012 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c90ad80e-9897-4e20-b9b0-6add43c84bd0-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.840577 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5255-account-create-update-k87hd" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.879361 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c6016e5-2641-4b82-b164-121ae822f863-operator-scripts\") pod \"2c6016e5-2641-4b82-b164-121ae822f863\" (UID: \"2c6016e5-2641-4b82-b164-121ae822f863\") " Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.879503 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfbcp\" (UniqueName: \"kubernetes.io/projected/2c6016e5-2641-4b82-b164-121ae822f863-kube-api-access-zfbcp\") pod \"2c6016e5-2641-4b82-b164-121ae822f863\" (UID: \"2c6016e5-2641-4b82-b164-121ae822f863\") " Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.880051 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c6016e5-2641-4b82-b164-121ae822f863-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2c6016e5-2641-4b82-b164-121ae822f863" (UID: "2c6016e5-2641-4b82-b164-121ae822f863"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.880500 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c6016e5-2641-4b82-b164-121ae822f863-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.886378 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c6016e5-2641-4b82-b164-121ae822f863-kube-api-access-zfbcp" (OuterVolumeSpecName: "kube-api-access-zfbcp") pod "2c6016e5-2641-4b82-b164-121ae822f863" (UID: "2c6016e5-2641-4b82-b164-121ae822f863"). InnerVolumeSpecName "kube-api-access-zfbcp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.886822 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-547a-account-create-update-tf2pb" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.958662 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jrxqx" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.958900 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jrxqx" event={"ID":"f1713962-9458-45b2-9f28-61409b7ff581","Type":"ContainerDied","Data":"0c4e303b9cbf7a244c07b330df27d0efc55bd33315e704ad0931c0431621f3c4"} Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.958935 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c4e303b9cbf7a244c07b330df27d0efc55bd33315e704ad0931c0431621f3c4" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.960611 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-547a-account-create-update-tf2pb" event={"ID":"6bdd8664-6d91-4616-8095-f44067fdca51","Type":"ContainerDied","Data":"36e84b5ed003c240081e5afc68dd15e5fc943e783f85671eaa2c34c2afee47fd"} Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.960734 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36e84b5ed003c240081e5afc68dd15e5fc943e783f85671eaa2c34c2afee47fd" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.960794 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-547a-account-create-update-tf2pb" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.963030 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-f66vm" event={"ID":"0cec0cd3-abcd-484c-85b8-03a44888a9b7","Type":"ContainerDied","Data":"8173fa3b8afd38768e66ef999c136681772389689078482ef577e5ddcd6b877a"} Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.963062 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8173fa3b8afd38768e66ef999c136681772389689078482ef577e5ddcd6b877a" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.963112 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-f66vm" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.966565 4781 generic.go:334] "Generic (PLEG): container finished" podID="1fe4edac-acb6-4906-9b3b-42b7c7a98943" containerID="28555d58f1fd114e239212917d6df64a83d89ed63bf1f65157974daf4ae101b8" exitCode=0 Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.966638 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535866-qpv8l" event={"ID":"1fe4edac-acb6-4906-9b3b-42b7c7a98943","Type":"ContainerDied","Data":"28555d58f1fd114e239212917d6df64a83d89ed63bf1f65157974daf4ae101b8"} Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.968508 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5255-account-create-update-k87hd" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.968519 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5255-account-create-update-k87hd" event={"ID":"2c6016e5-2641-4b82-b164-121ae822f863","Type":"ContainerDied","Data":"c5015fd63243058303378753db482675b9cd87268a9cd806c0e30c22950038da"} Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.968553 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5015fd63243058303378753db482675b9cd87268a9cd806c0e30c22950038da" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.970620 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"58009056-4183-4017-bfa1-c14ce28b92ea","Type":"ContainerStarted","Data":"08ef63f36bfd20f4318ef29e1e8d3879e1a5dd2fa86a7d1fbfe5e75b632a8837"} Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.970800 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.972039 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-94sd2" event={"ID":"c90ad80e-9897-4e20-b9b0-6add43c84bd0","Type":"ContainerDied","Data":"5b5249fbcfef480f4709ba4ec70ed903f8b332b1831523d320ff5af46111ef22"} Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.972070 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b5249fbcfef480f4709ba4ec70ed903f8b332b1831523d320ff5af46111ef22" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.972076 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-94sd2" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.984467 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bdd8664-6d91-4616-8095-f44067fdca51-operator-scripts\") pod \"6bdd8664-6d91-4616-8095-f44067fdca51\" (UID: \"6bdd8664-6d91-4616-8095-f44067fdca51\") " Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.984874 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bdd8664-6d91-4616-8095-f44067fdca51-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6bdd8664-6d91-4616-8095-f44067fdca51" (UID: "6bdd8664-6d91-4616-8095-f44067fdca51"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.984613 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlll5\" (UniqueName: \"kubernetes.io/projected/6bdd8664-6d91-4616-8095-f44067fdca51-kube-api-access-zlll5\") pod \"6bdd8664-6d91-4616-8095-f44067fdca51\" (UID: \"6bdd8664-6d91-4616-8095-f44067fdca51\") " Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.988143 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfbcp\" (UniqueName: \"kubernetes.io/projected/2c6016e5-2641-4b82-b164-121ae822f863-kube-api-access-zfbcp\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.988174 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bdd8664-6d91-4616-8095-f44067fdca51-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.988559 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/6bdd8664-6d91-4616-8095-f44067fdca51-kube-api-access-zlll5" (OuterVolumeSpecName: "kube-api-access-zlll5") pod "6bdd8664-6d91-4616-8095-f44067fdca51" (UID: "6bdd8664-6d91-4616-8095-f44067fdca51"). InnerVolumeSpecName "kube-api-access-zlll5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:04 crc kubenswrapper[4781]: I0227 00:26:04.989443 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.007395 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=17.995921644 podStartE2EDuration="1m0.007377485s" podCreationTimestamp="2026-02-27 00:25:05 +0000 UTC" firstStartedPulling="2026-02-27 00:25:18.868418986 +0000 UTC m=+1188.125958540" lastFinishedPulling="2026-02-27 00:26:00.879874827 +0000 UTC m=+1230.137414381" observedRunningTime="2026-02-27 00:26:04.994153906 +0000 UTC m=+1234.251693460" watchObservedRunningTime="2026-02-27 00:26:05.007377485 +0000 UTC m=+1234.264917039" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.015584 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8c9b-account-create-update-d29bm" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.026349 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-mmj84" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.088932 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjn7t\" (UniqueName: \"kubernetes.io/projected/c986902c-3a54-4300-a078-2e70d305e97e-kube-api-access-tjn7t\") pod \"c986902c-3a54-4300-a078-2e70d305e97e\" (UID: \"c986902c-3a54-4300-a078-2e70d305e97e\") " Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.089055 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb4687ec-812e-48bb-8d53-ed628f3cd013-operator-scripts\") pod \"bb4687ec-812e-48bb-8d53-ed628f3cd013\" (UID: \"bb4687ec-812e-48bb-8d53-ed628f3cd013\") " Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.089188 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c986902c-3a54-4300-a078-2e70d305e97e-operator-scripts\") pod \"c986902c-3a54-4300-a078-2e70d305e97e\" (UID: \"c986902c-3a54-4300-a078-2e70d305e97e\") " Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.089301 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2c6b\" (UniqueName: \"kubernetes.io/projected/bb4687ec-812e-48bb-8d53-ed628f3cd013-kube-api-access-l2c6b\") pod \"bb4687ec-812e-48bb-8d53-ed628f3cd013\" (UID: \"bb4687ec-812e-48bb-8d53-ed628f3cd013\") " Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.089712 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb4687ec-812e-48bb-8d53-ed628f3cd013-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bb4687ec-812e-48bb-8d53-ed628f3cd013" (UID: "bb4687ec-812e-48bb-8d53-ed628f3cd013"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.089776 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c986902c-3a54-4300-a078-2e70d305e97e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c986902c-3a54-4300-a078-2e70d305e97e" (UID: "c986902c-3a54-4300-a078-2e70d305e97e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.093041 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c986902c-3a54-4300-a078-2e70d305e97e-kube-api-access-tjn7t" (OuterVolumeSpecName: "kube-api-access-tjn7t") pod "c986902c-3a54-4300-a078-2e70d305e97e" (UID: "c986902c-3a54-4300-a078-2e70d305e97e"). InnerVolumeSpecName "kube-api-access-tjn7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.095142 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb4687ec-812e-48bb-8d53-ed628f3cd013-kube-api-access-l2c6b" (OuterVolumeSpecName: "kube-api-access-l2c6b") pod "bb4687ec-812e-48bb-8d53-ed628f3cd013" (UID: "bb4687ec-812e-48bb-8d53-ed628f3cd013"). InnerVolumeSpecName "kube-api-access-l2c6b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.095721 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlll5\" (UniqueName: \"kubernetes.io/projected/6bdd8664-6d91-4616-8095-f44067fdca51-kube-api-access-zlll5\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.095883 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjn7t\" (UniqueName: \"kubernetes.io/projected/c986902c-3a54-4300-a078-2e70d305e97e-kube-api-access-tjn7t\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.096017 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb4687ec-812e-48bb-8d53-ed628f3cd013-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.096118 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c986902c-3a54-4300-a078-2e70d305e97e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.198904 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2c6b\" (UniqueName: \"kubernetes.io/projected/bb4687ec-812e-48bb-8d53-ed628f3cd013-kube-api-access-l2c6b\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.364742 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535866-qpv8l" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.503391 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7z28p\" (UniqueName: \"kubernetes.io/projected/1fe4edac-acb6-4906-9b3b-42b7c7a98943-kube-api-access-7z28p\") pod \"1fe4edac-acb6-4906-9b3b-42b7c7a98943\" (UID: \"1fe4edac-acb6-4906-9b3b-42b7c7a98943\") " Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.529817 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fe4edac-acb6-4906-9b3b-42b7c7a98943-kube-api-access-7z28p" (OuterVolumeSpecName: "kube-api-access-7z28p") pod "1fe4edac-acb6-4906-9b3b-42b7c7a98943" (UID: "1fe4edac-acb6-4906-9b3b-42b7c7a98943"). InnerVolumeSpecName "kube-api-access-7z28p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.606821 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7z28p\" (UniqueName: \"kubernetes.io/projected/1fe4edac-acb6-4906-9b3b-42b7c7a98943-kube-api-access-7z28p\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.716738 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.979938 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-mmj84" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.979919 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mmj84" event={"ID":"c986902c-3a54-4300-a078-2e70d305e97e","Type":"ContainerDied","Data":"6b8b1c057f209e42dffb4f53945e78631d479cd15aecac6c6f86368ea8a7b90e"} Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.980840 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b8b1c057f209e42dffb4f53945e78631d479cd15aecac6c6f86368ea8a7b90e" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.981715 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8c9b-account-create-update-d29bm" event={"ID":"bb4687ec-812e-48bb-8d53-ed628f3cd013","Type":"ContainerDied","Data":"1c0a5f73b00e8a9559b8d8378ce7177f49256b9d4979478bb067d80bc66e07fc"} Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.981748 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c0a5f73b00e8a9559b8d8378ce7177f49256b9d4979478bb067d80bc66e07fc" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.981767 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8c9b-account-create-update-d29bm" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.983099 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535866-qpv8l" Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.983200 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535866-qpv8l" event={"ID":"1fe4edac-acb6-4906-9b3b-42b7c7a98943","Type":"ContainerDied","Data":"557e493fffe4db2dcddedbe073df198bb6b870988be78491ac11e021467b67f0"} Feb 27 00:26:05 crc kubenswrapper[4781]: I0227 00:26:05.983230 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="557e493fffe4db2dcddedbe073df198bb6b870988be78491ac11e021467b67f0" Feb 27 00:26:06 crc kubenswrapper[4781]: I0227 00:26:06.108359 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f" Feb 27 00:26:06 crc kubenswrapper[4781]: I0227 00:26:06.489728 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535860-d6xsb"] Feb 27 00:26:06 crc kubenswrapper[4781]: I0227 00:26:06.500124 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535860-d6xsb"] Feb 27 00:26:06 crc kubenswrapper[4781]: I0227 00:26:06.952743 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="2691e066-2f4c-4e7e-bcac-01933bd6cadb" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 27 00:26:07 crc kubenswrapper[4781]: I0227 00:26:07.017371 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 27 00:26:07 crc kubenswrapper[4781]: I0227 00:26:07.319508 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8ba504f-040f-4632-b5d0-4b28aef8d27e" path="/var/lib/kubelet/pods/c8ba504f-040f-4632-b5d0-4b28aef8d27e/volumes" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.215386 4781 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-8tmft"] Feb 27 00:26:08 crc kubenswrapper[4781]: E0227 00:26:08.229419 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c6016e5-2641-4b82-b164-121ae822f863" containerName="mariadb-account-create-update" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.229465 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c6016e5-2641-4b82-b164-121ae822f863" containerName="mariadb-account-create-update" Feb 27 00:26:08 crc kubenswrapper[4781]: E0227 00:26:08.229486 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c90ad80e-9897-4e20-b9b0-6add43c84bd0" containerName="mariadb-database-create" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.229494 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c90ad80e-9897-4e20-b9b0-6add43c84bd0" containerName="mariadb-database-create" Feb 27 00:26:08 crc kubenswrapper[4781]: E0227 00:26:08.229508 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e0e3c40-86af-4986-bf58-fa79ce187828" containerName="init" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.229518 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e0e3c40-86af-4986-bf58-fa79ce187828" containerName="init" Feb 27 00:26:08 crc kubenswrapper[4781]: E0227 00:26:08.229532 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bdd8664-6d91-4616-8095-f44067fdca51" containerName="mariadb-account-create-update" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.229542 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bdd8664-6d91-4616-8095-f44067fdca51" containerName="mariadb-account-create-update" Feb 27 00:26:08 crc kubenswrapper[4781]: E0227 00:26:08.229553 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1713962-9458-45b2-9f28-61409b7ff581" containerName="mariadb-database-create" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.229561 
4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1713962-9458-45b2-9f28-61409b7ff581" containerName="mariadb-database-create" Feb 27 00:26:08 crc kubenswrapper[4781]: E0227 00:26:08.229585 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb4687ec-812e-48bb-8d53-ed628f3cd013" containerName="mariadb-account-create-update" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.229592 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb4687ec-812e-48bb-8d53-ed628f3cd013" containerName="mariadb-account-create-update" Feb 27 00:26:08 crc kubenswrapper[4781]: E0227 00:26:08.229607 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cec0cd3-abcd-484c-85b8-03a44888a9b7" containerName="mariadb-database-create" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.229614 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cec0cd3-abcd-484c-85b8-03a44888a9b7" containerName="mariadb-database-create" Feb 27 00:26:08 crc kubenswrapper[4781]: E0227 00:26:08.229644 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e0e3c40-86af-4986-bf58-fa79ce187828" containerName="dnsmasq-dns" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.229652 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e0e3c40-86af-4986-bf58-fa79ce187828" containerName="dnsmasq-dns" Feb 27 00:26:08 crc kubenswrapper[4781]: E0227 00:26:08.229663 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c986902c-3a54-4300-a078-2e70d305e97e" containerName="mariadb-account-create-update" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.229671 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c986902c-3a54-4300-a078-2e70d305e97e" containerName="mariadb-account-create-update" Feb 27 00:26:08 crc kubenswrapper[4781]: E0227 00:26:08.229680 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe4edac-acb6-4906-9b3b-42b7c7a98943" containerName="oc" Feb 27 00:26:08 
crc kubenswrapper[4781]: I0227 00:26:08.229687 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe4edac-acb6-4906-9b3b-42b7c7a98943" containerName="oc" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.229951 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb4687ec-812e-48bb-8d53-ed628f3cd013" containerName="mariadb-account-create-update" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.229969 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c986902c-3a54-4300-a078-2e70d305e97e" containerName="mariadb-account-create-update" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.229983 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e0e3c40-86af-4986-bf58-fa79ce187828" containerName="dnsmasq-dns" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.229998 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bdd8664-6d91-4616-8095-f44067fdca51" containerName="mariadb-account-create-update" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.230008 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fe4edac-acb6-4906-9b3b-42b7c7a98943" containerName="oc" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.230019 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c90ad80e-9897-4e20-b9b0-6add43c84bd0" containerName="mariadb-database-create" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.230028 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1713962-9458-45b2-9f28-61409b7ff581" containerName="mariadb-database-create" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.230036 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cec0cd3-abcd-484c-85b8-03a44888a9b7" containerName="mariadb-database-create" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.230052 4781 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2c6016e5-2641-4b82-b164-121ae822f863" containerName="mariadb-account-create-update" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.230659 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-8tmft"] Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.230755 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-8tmft" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.237551 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-4ql2s" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.237935 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.351880 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47cc3f01-6a5c-4797-bf86-25770e66e928-config-data\") pod \"glance-db-sync-8tmft\" (UID: \"47cc3f01-6a5c-4797-bf86-25770e66e928\") " pod="openstack/glance-db-sync-8tmft" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.351976 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsrkc\" (UniqueName: \"kubernetes.io/projected/47cc3f01-6a5c-4797-bf86-25770e66e928-kube-api-access-gsrkc\") pod \"glance-db-sync-8tmft\" (UID: \"47cc3f01-6a5c-4797-bf86-25770e66e928\") " pod="openstack/glance-db-sync-8tmft" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.352015 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/47cc3f01-6a5c-4797-bf86-25770e66e928-db-sync-config-data\") pod \"glance-db-sync-8tmft\" (UID: \"47cc3f01-6a5c-4797-bf86-25770e66e928\") " pod="openstack/glance-db-sync-8tmft" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 
00:26:08.352107 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47cc3f01-6a5c-4797-bf86-25770e66e928-combined-ca-bundle\") pod \"glance-db-sync-8tmft\" (UID: \"47cc3f01-6a5c-4797-bf86-25770e66e928\") " pod="openstack/glance-db-sync-8tmft" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.454039 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsrkc\" (UniqueName: \"kubernetes.io/projected/47cc3f01-6a5c-4797-bf86-25770e66e928-kube-api-access-gsrkc\") pod \"glance-db-sync-8tmft\" (UID: \"47cc3f01-6a5c-4797-bf86-25770e66e928\") " pod="openstack/glance-db-sync-8tmft" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.454114 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/47cc3f01-6a5c-4797-bf86-25770e66e928-db-sync-config-data\") pod \"glance-db-sync-8tmft\" (UID: \"47cc3f01-6a5c-4797-bf86-25770e66e928\") " pod="openstack/glance-db-sync-8tmft" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.454911 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47cc3f01-6a5c-4797-bf86-25770e66e928-combined-ca-bundle\") pod \"glance-db-sync-8tmft\" (UID: \"47cc3f01-6a5c-4797-bf86-25770e66e928\") " pod="openstack/glance-db-sync-8tmft" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.454985 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47cc3f01-6a5c-4797-bf86-25770e66e928-config-data\") pod \"glance-db-sync-8tmft\" (UID: \"47cc3f01-6a5c-4797-bf86-25770e66e928\") " pod="openstack/glance-db-sync-8tmft" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.463431 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/47cc3f01-6a5c-4797-bf86-25770e66e928-config-data\") pod \"glance-db-sync-8tmft\" (UID: \"47cc3f01-6a5c-4797-bf86-25770e66e928\") " pod="openstack/glance-db-sync-8tmft" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.466098 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/47cc3f01-6a5c-4797-bf86-25770e66e928-db-sync-config-data\") pod \"glance-db-sync-8tmft\" (UID: \"47cc3f01-6a5c-4797-bf86-25770e66e928\") " pod="openstack/glance-db-sync-8tmft" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.470226 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47cc3f01-6a5c-4797-bf86-25770e66e928-combined-ca-bundle\") pod \"glance-db-sync-8tmft\" (UID: \"47cc3f01-6a5c-4797-bf86-25770e66e928\") " pod="openstack/glance-db-sync-8tmft" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.474254 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsrkc\" (UniqueName: \"kubernetes.io/projected/47cc3f01-6a5c-4797-bf86-25770e66e928-kube-api-access-gsrkc\") pod \"glance-db-sync-8tmft\" (UID: \"47cc3f01-6a5c-4797-bf86-25770e66e928\") " pod="openstack/glance-db-sync-8tmft" Feb 27 00:26:08 crc kubenswrapper[4781]: I0227 00:26:08.551878 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-8tmft" Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:08.738876 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:08.992152 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:08.998324 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.002825 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-f6c72" Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.003137 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.003321 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.003503 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.039784 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.134538 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-8tmft"] Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.170726 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5923572-3637-49e3-9eea-72e52c5fb88b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0" Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.170777 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5923572-3637-49e3-9eea-72e52c5fb88b-scripts\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0" Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.170859 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d5923572-3637-49e3-9eea-72e52c5fb88b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0" Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.170889 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5923572-3637-49e3-9eea-72e52c5fb88b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0" Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.171034 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp584\" (UniqueName: \"kubernetes.io/projected/d5923572-3637-49e3-9eea-72e52c5fb88b-kube-api-access-cp584\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0" Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.171185 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5923572-3637-49e3-9eea-72e52c5fb88b-config\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0" Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.171460 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d5923572-3637-49e3-9eea-72e52c5fb88b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0" Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.272786 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5923572-3637-49e3-9eea-72e52c5fb88b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0" Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.272834 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5923572-3637-49e3-9eea-72e52c5fb88b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0" Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.272872 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp584\" (UniqueName: \"kubernetes.io/projected/d5923572-3637-49e3-9eea-72e52c5fb88b-kube-api-access-cp584\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0" Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.272894 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5923572-3637-49e3-9eea-72e52c5fb88b-config\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0" Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.272979 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d5923572-3637-49e3-9eea-72e52c5fb88b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0" Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.273051 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5923572-3637-49e3-9eea-72e52c5fb88b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0" Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.273080 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5923572-3637-49e3-9eea-72e52c5fb88b-scripts\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0" Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.274080 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d5923572-3637-49e3-9eea-72e52c5fb88b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0" Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.274824 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5923572-3637-49e3-9eea-72e52c5fb88b-config\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0" Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.275366 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5923572-3637-49e3-9eea-72e52c5fb88b-scripts\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0" Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.281642 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5923572-3637-49e3-9eea-72e52c5fb88b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0" Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.284925 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5923572-3637-49e3-9eea-72e52c5fb88b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0" Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.284940 
4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5923572-3637-49e3-9eea-72e52c5fb88b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0" Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.295414 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp584\" (UniqueName: \"kubernetes.io/projected/d5923572-3637-49e3-9eea-72e52c5fb88b-kube-api-access-cp584\") pod \"ovn-northd-0\" (UID: \"d5923572-3637-49e3-9eea-72e52c5fb88b\") " pod="openstack/ovn-northd-0" Feb 27 00:26:09 crc kubenswrapper[4781]: I0227 00:26:09.335014 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 27 00:26:10 crc kubenswrapper[4781]: I0227 00:26:10.026109 4781 generic.go:334] "Generic (PLEG): container finished" podID="b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b" containerID="7d3236f4301015aa89ce006050dc39e2b0704b179ed73e64d6106302850331f9" exitCode=0 Feb 27 00:26:10 crc kubenswrapper[4781]: I0227 00:26:10.026189 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-6n9rn" event={"ID":"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b","Type":"ContainerDied","Data":"7d3236f4301015aa89ce006050dc39e2b0704b179ed73e64d6106302850331f9"} Feb 27 00:26:10 crc kubenswrapper[4781]: I0227 00:26:10.028271 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8tmft" event={"ID":"47cc3f01-6a5c-4797-bf86-25770e66e928","Type":"ContainerStarted","Data":"c43bdd484887a1ab19b1a74ff7e94493b840e9d2b41b9b9e8c3466f0b78cc88d"} Feb 27 00:26:10 crc kubenswrapper[4781]: I0227 00:26:10.047719 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 27 00:26:10 crc kubenswrapper[4781]: W0227 00:26:10.049579 4781 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5923572_3637_49e3_9eea_72e52c5fb88b.slice/crio-d7cef7c0a0810b567b27259e063c6dad9a86a7be2a05f6f35d2ea8ce4f01bcb4 WatchSource:0}: Error finding container d7cef7c0a0810b567b27259e063c6dad9a86a7be2a05f6f35d2ea8ce4f01bcb4: Status 404 returned error can't find the container with id d7cef7c0a0810b567b27259e063c6dad9a86a7be2a05f6f35d2ea8ce4f01bcb4 Feb 27 00:26:10 crc kubenswrapper[4781]: I0227 00:26:10.925030 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-mmj84"] Feb 27 00:26:10 crc kubenswrapper[4781]: I0227 00:26:10.934765 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-mmj84"] Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.042547 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d5923572-3637-49e3-9eea-72e52c5fb88b","Type":"ContainerStarted","Data":"d7cef7c0a0810b567b27259e063c6dad9a86a7be2a05f6f35d2ea8ce4f01bcb4"} Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.331272 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c986902c-3a54-4300-a078-2e70d305e97e" path="/var/lib/kubelet/pods/c986902c-3a54-4300-a078-2e70d305e97e/volumes" Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.469963 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.620099 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6d9g\" (UniqueName: \"kubernetes.io/projected/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-kube-api-access-k6d9g\") pod \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.620171 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-etc-swift\") pod \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.620271 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-swiftconf\") pod \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.620305 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-ring-data-devices\") pod \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.620325 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-scripts\") pod \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.620410 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-dispersionconf\") pod \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.620429 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-combined-ca-bundle\") pod \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\" (UID: \"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b\") " Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.621350 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b" (UID: "b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.621575 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b" (UID: "b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.625847 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-kube-api-access-k6d9g" (OuterVolumeSpecName: "kube-api-access-k6d9g") pod "b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b" (UID: "b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b"). InnerVolumeSpecName "kube-api-access-k6d9g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.645018 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b" (UID: "b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.646897 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b" (UID: "b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.647693 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b" (UID: "b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.659185 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-scripts" (OuterVolumeSpecName: "scripts") pod "b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b" (UID: "b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.722891 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.722921 4781 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.722936 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.722950 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6d9g\" (UniqueName: \"kubernetes.io/projected/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-kube-api-access-k6d9g\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.722962 4781 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.722973 4781 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:11 crc kubenswrapper[4781]: I0227 00:26:11.722982 4781 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:12 crc kubenswrapper[4781]: I0227 00:26:12.092186 4781 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d5923572-3637-49e3-9eea-72e52c5fb88b","Type":"ContainerStarted","Data":"27c60acd2a3e598cd4eb2f0aca7bc6776567328b53a7fd4e5925a5387dd11dae"} Feb 27 00:26:12 crc kubenswrapper[4781]: I0227 00:26:12.094867 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-6n9rn" event={"ID":"b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b","Type":"ContainerDied","Data":"ca26accad7ac480d16da11e818bc3769c592f1c77082ff70a7fdd81af22f0086"} Feb 27 00:26:12 crc kubenswrapper[4781]: I0227 00:26:12.094900 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca26accad7ac480d16da11e818bc3769c592f1c77082ff70a7fdd81af22f0086" Feb 27 00:26:12 crc kubenswrapper[4781]: I0227 00:26:12.094983 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-6n9rn" Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.106607 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d5923572-3637-49e3-9eea-72e52c5fb88b","Type":"ContainerStarted","Data":"5e9e97c2d3c5129072f1159d516f11fdd77eb7aa658b8b7af077d3e387dfebbf"} Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.106964 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.131339 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.386301897 podStartE2EDuration="5.131317099s" podCreationTimestamp="2026-02-27 00:26:08 +0000 UTC" firstStartedPulling="2026-02-27 00:26:10.051988296 +0000 UTC m=+1239.309527840" lastFinishedPulling="2026-02-27 00:26:11.797003488 +0000 UTC m=+1241.054543042" observedRunningTime="2026-02-27 00:26:13.121978302 +0000 UTC m=+1242.379517866" watchObservedRunningTime="2026-02-27 00:26:13.131317099 +0000 UTC 
m=+1242.388856663" Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.333425 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.339096 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-hcb9s" Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.578800 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9zkpb-config-9k74f"] Feb 27 00:26:13 crc kubenswrapper[4781]: E0227 00:26:13.579577 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b" containerName="swift-ring-rebalance" Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.579676 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b" containerName="swift-ring-rebalance" Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.579950 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b" containerName="swift-ring-rebalance" Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.580779 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9zkpb-config-9k74f" Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.587083 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.592786 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9zkpb-config-9k74f"] Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.676925 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-scripts\") pod \"ovn-controller-9zkpb-config-9k74f\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " pod="openstack/ovn-controller-9zkpb-config-9k74f" Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.677369 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-additional-scripts\") pod \"ovn-controller-9zkpb-config-9k74f\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " pod="openstack/ovn-controller-9zkpb-config-9k74f" Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.677408 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-var-run-ovn\") pod \"ovn-controller-9zkpb-config-9k74f\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " pod="openstack/ovn-controller-9zkpb-config-9k74f" Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.677458 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-var-log-ovn\") pod \"ovn-controller-9zkpb-config-9k74f\" (UID: 
\"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " pod="openstack/ovn-controller-9zkpb-config-9k74f" Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.677503 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-var-run\") pod \"ovn-controller-9zkpb-config-9k74f\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " pod="openstack/ovn-controller-9zkpb-config-9k74f" Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.677546 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6wjb\" (UniqueName: \"kubernetes.io/projected/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-kube-api-access-j6wjb\") pod \"ovn-controller-9zkpb-config-9k74f\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " pod="openstack/ovn-controller-9zkpb-config-9k74f" Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.779710 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6wjb\" (UniqueName: \"kubernetes.io/projected/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-kube-api-access-j6wjb\") pod \"ovn-controller-9zkpb-config-9k74f\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " pod="openstack/ovn-controller-9zkpb-config-9k74f" Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.779828 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-scripts\") pod \"ovn-controller-9zkpb-config-9k74f\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " pod="openstack/ovn-controller-9zkpb-config-9k74f" Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.779930 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-additional-scripts\") pod 
\"ovn-controller-9zkpb-config-9k74f\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " pod="openstack/ovn-controller-9zkpb-config-9k74f" Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.779964 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-var-run-ovn\") pod \"ovn-controller-9zkpb-config-9k74f\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " pod="openstack/ovn-controller-9zkpb-config-9k74f" Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.780012 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-var-log-ovn\") pod \"ovn-controller-9zkpb-config-9k74f\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " pod="openstack/ovn-controller-9zkpb-config-9k74f" Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.780057 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-var-run\") pod \"ovn-controller-9zkpb-config-9k74f\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " pod="openstack/ovn-controller-9zkpb-config-9k74f" Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.780419 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-var-run\") pod \"ovn-controller-9zkpb-config-9k74f\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " pod="openstack/ovn-controller-9zkpb-config-9k74f" Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.780487 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-var-run-ovn\") pod \"ovn-controller-9zkpb-config-9k74f\" (UID: 
\"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " pod="openstack/ovn-controller-9zkpb-config-9k74f" Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.780536 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-var-log-ovn\") pod \"ovn-controller-9zkpb-config-9k74f\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " pod="openstack/ovn-controller-9zkpb-config-9k74f" Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.780822 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-additional-scripts\") pod \"ovn-controller-9zkpb-config-9k74f\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " pod="openstack/ovn-controller-9zkpb-config-9k74f" Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.783187 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-scripts\") pod \"ovn-controller-9zkpb-config-9k74f\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " pod="openstack/ovn-controller-9zkpb-config-9k74f" Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.798787 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6wjb\" (UniqueName: \"kubernetes.io/projected/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-kube-api-access-j6wjb\") pod \"ovn-controller-9zkpb-config-9k74f\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " pod="openstack/ovn-controller-9zkpb-config-9k74f" Feb 27 00:26:13 crc kubenswrapper[4781]: I0227 00:26:13.935815 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9zkpb-config-9k74f" Feb 27 00:26:15 crc kubenswrapper[4781]: I0227 00:26:15.925282 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-wxsbg"] Feb 27 00:26:15 crc kubenswrapper[4781]: I0227 00:26:15.927822 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-wxsbg" Feb 27 00:26:15 crc kubenswrapper[4781]: I0227 00:26:15.931075 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 27 00:26:15 crc kubenswrapper[4781]: I0227 00:26:15.935331 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-wxsbg"] Feb 27 00:26:16 crc kubenswrapper[4781]: I0227 00:26:16.028161 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cafd294d-e929-4cd5-8be3-7175ad4aed09-operator-scripts\") pod \"root-account-create-update-wxsbg\" (UID: \"cafd294d-e929-4cd5-8be3-7175ad4aed09\") " pod="openstack/root-account-create-update-wxsbg" Feb 27 00:26:16 crc kubenswrapper[4781]: I0227 00:26:16.028372 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zhb8\" (UniqueName: \"kubernetes.io/projected/cafd294d-e929-4cd5-8be3-7175ad4aed09-kube-api-access-5zhb8\") pod \"root-account-create-update-wxsbg\" (UID: \"cafd294d-e929-4cd5-8be3-7175ad4aed09\") " pod="openstack/root-account-create-update-wxsbg" Feb 27 00:26:16 crc kubenswrapper[4781]: I0227 00:26:16.129737 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zhb8\" (UniqueName: \"kubernetes.io/projected/cafd294d-e929-4cd5-8be3-7175ad4aed09-kube-api-access-5zhb8\") pod \"root-account-create-update-wxsbg\" (UID: \"cafd294d-e929-4cd5-8be3-7175ad4aed09\") " 
pod="openstack/root-account-create-update-wxsbg" Feb 27 00:26:16 crc kubenswrapper[4781]: I0227 00:26:16.129833 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cafd294d-e929-4cd5-8be3-7175ad4aed09-operator-scripts\") pod \"root-account-create-update-wxsbg\" (UID: \"cafd294d-e929-4cd5-8be3-7175ad4aed09\") " pod="openstack/root-account-create-update-wxsbg" Feb 27 00:26:16 crc kubenswrapper[4781]: I0227 00:26:16.130845 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cafd294d-e929-4cd5-8be3-7175ad4aed09-operator-scripts\") pod \"root-account-create-update-wxsbg\" (UID: \"cafd294d-e929-4cd5-8be3-7175ad4aed09\") " pod="openstack/root-account-create-update-wxsbg" Feb 27 00:26:16 crc kubenswrapper[4781]: I0227 00:26:16.149703 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zhb8\" (UniqueName: \"kubernetes.io/projected/cafd294d-e929-4cd5-8be3-7175ad4aed09-kube-api-access-5zhb8\") pod \"root-account-create-update-wxsbg\" (UID: \"cafd294d-e929-4cd5-8be3-7175ad4aed09\") " pod="openstack/root-account-create-update-wxsbg" Feb 27 00:26:16 crc kubenswrapper[4781]: I0227 00:26:16.261665 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-wxsbg" Feb 27 00:26:16 crc kubenswrapper[4781]: I0227 00:26:16.950965 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="2691e066-2f4c-4e7e-bcac-01933bd6cadb" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 27 00:26:17 crc kubenswrapper[4781]: I0227 00:26:17.546423 4781 scope.go:117] "RemoveContainer" containerID="3eb6fa2c40c5ff8bd90c7472dc3a2b552bb7c38236a559c08d25c903e216a06b" Feb 27 00:26:18 crc kubenswrapper[4781]: I0227 00:26:18.271121 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:26:18 crc kubenswrapper[4781]: I0227 00:26:18.285535 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11-etc-swift\") pod \"swift-storage-0\" (UID: \"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11\") " pod="openstack/swift-storage-0" Feb 27 00:26:18 crc kubenswrapper[4781]: I0227 00:26:18.567461 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.365573 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.648662 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-zvn4t"] Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.650044 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-zvn4t" Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.677686 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-zvn4t"] Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.764341 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.805417 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh4qm\" (UniqueName: \"kubernetes.io/projected/e8806487-486f-464d-8249-b6368daabff5-kube-api-access-qh4qm\") pod \"cinder-db-create-zvn4t\" (UID: \"e8806487-486f-464d-8249-b6368daabff5\") " pod="openstack/cinder-db-create-zvn4t" Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.805506 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8806487-486f-464d-8249-b6368daabff5-operator-scripts\") pod \"cinder-db-create-zvn4t\" (UID: \"e8806487-486f-464d-8249-b6368daabff5\") " pod="openstack/cinder-db-create-zvn4t" Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.808500 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-6e38-account-create-update-dntk2"] Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.809944 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-6e38-account-create-update-dntk2" Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.812981 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.833293 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-6e38-account-create-update-dntk2"] Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.864753 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-create-99xdp"] Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.866281 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-99xdp" Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.907405 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh4qm\" (UniqueName: \"kubernetes.io/projected/e8806487-486f-464d-8249-b6368daabff5-kube-api-access-qh4qm\") pod \"cinder-db-create-zvn4t\" (UID: \"e8806487-486f-464d-8249-b6368daabff5\") " pod="openstack/cinder-db-create-zvn4t" Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.907514 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0eb55288-e9bb-46f0-bae3-789e8db036cf-operator-scripts\") pod \"cinder-6e38-account-create-update-dntk2\" (UID: \"0eb55288-e9bb-46f0-bae3-789e8db036cf\") " pod="openstack/cinder-6e38-account-create-update-dntk2" Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.907551 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8806487-486f-464d-8249-b6368daabff5-operator-scripts\") pod \"cinder-db-create-zvn4t\" (UID: \"e8806487-486f-464d-8249-b6368daabff5\") " pod="openstack/cinder-db-create-zvn4t" Feb 27 00:26:19 crc 
kubenswrapper[4781]: I0227 00:26:19.907570 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjdvb\" (UniqueName: \"kubernetes.io/projected/0eb55288-e9bb-46f0-bae3-789e8db036cf-kube-api-access-wjdvb\") pod \"cinder-6e38-account-create-update-dntk2\" (UID: \"0eb55288-e9bb-46f0-bae3-789e8db036cf\") " pod="openstack/cinder-6e38-account-create-update-dntk2" Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.908845 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8806487-486f-464d-8249-b6368daabff5-operator-scripts\") pod \"cinder-db-create-zvn4t\" (UID: \"e8806487-486f-464d-8249-b6368daabff5\") " pod="openstack/cinder-db-create-zvn4t" Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.917512 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-99xdp"] Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.962416 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh4qm\" (UniqueName: \"kubernetes.io/projected/e8806487-486f-464d-8249-b6368daabff5-kube-api-access-qh4qm\") pod \"cinder-db-create-zvn4t\" (UID: \"e8806487-486f-464d-8249-b6368daabff5\") " pod="openstack/cinder-db-create-zvn4t" Feb 27 00:26:19 crc kubenswrapper[4781]: I0227 00:26:19.974106 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-zvn4t" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.008828 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0eb55288-e9bb-46f0-bae3-789e8db036cf-operator-scripts\") pod \"cinder-6e38-account-create-update-dntk2\" (UID: \"0eb55288-e9bb-46f0-bae3-789e8db036cf\") " pod="openstack/cinder-6e38-account-create-update-dntk2" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.009352 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjdvb\" (UniqueName: \"kubernetes.io/projected/0eb55288-e9bb-46f0-bae3-789e8db036cf-kube-api-access-wjdvb\") pod \"cinder-6e38-account-create-update-dntk2\" (UID: \"0eb55288-e9bb-46f0-bae3-789e8db036cf\") " pod="openstack/cinder-6e38-account-create-update-dntk2" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.010477 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24adb929-f812-4243-94ea-23345856d28f-operator-scripts\") pod \"cloudkitty-db-create-99xdp\" (UID: \"24adb929-f812-4243-94ea-23345856d28f\") " pod="openstack/cloudkitty-db-create-99xdp" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.010891 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cr8q\" (UniqueName: \"kubernetes.io/projected/24adb929-f812-4243-94ea-23345856d28f-kube-api-access-6cr8q\") pod \"cloudkitty-db-create-99xdp\" (UID: \"24adb929-f812-4243-94ea-23345856d28f\") " pod="openstack/cloudkitty-db-create-99xdp" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.010351 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0eb55288-e9bb-46f0-bae3-789e8db036cf-operator-scripts\") pod 
\"cinder-6e38-account-create-update-dntk2\" (UID: \"0eb55288-e9bb-46f0-bae3-789e8db036cf\") " pod="openstack/cinder-6e38-account-create-update-dntk2" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.010511 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-m5rm5"] Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.012362 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-m5rm5" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.039398 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjdvb\" (UniqueName: \"kubernetes.io/projected/0eb55288-e9bb-46f0-bae3-789e8db036cf-kube-api-access-wjdvb\") pod \"cinder-6e38-account-create-update-dntk2\" (UID: \"0eb55288-e9bb-46f0-bae3-789e8db036cf\") " pod="openstack/cinder-6e38-account-create-update-dntk2" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.051938 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-m5rm5"] Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.114929 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3aedfe4-2bbb-46c9-97d4-8d6782c44707-operator-scripts\") pod \"barbican-db-create-m5rm5\" (UID: \"e3aedfe4-2bbb-46c9-97d4-8d6782c44707\") " pod="openstack/barbican-db-create-m5rm5" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.115417 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24adb929-f812-4243-94ea-23345856d28f-operator-scripts\") pod \"cloudkitty-db-create-99xdp\" (UID: \"24adb929-f812-4243-94ea-23345856d28f\") " pod="openstack/cloudkitty-db-create-99xdp" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.115461 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdgbd\" (UniqueName: \"kubernetes.io/projected/e3aedfe4-2bbb-46c9-97d4-8d6782c44707-kube-api-access-qdgbd\") pod \"barbican-db-create-m5rm5\" (UID: \"e3aedfe4-2bbb-46c9-97d4-8d6782c44707\") " pod="openstack/barbican-db-create-m5rm5" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.115486 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cr8q\" (UniqueName: \"kubernetes.io/projected/24adb929-f812-4243-94ea-23345856d28f-kube-api-access-6cr8q\") pod \"cloudkitty-db-create-99xdp\" (UID: \"24adb929-f812-4243-94ea-23345856d28f\") " pod="openstack/cloudkitty-db-create-99xdp" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.116383 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24adb929-f812-4243-94ea-23345856d28f-operator-scripts\") pod \"cloudkitty-db-create-99xdp\" (UID: \"24adb929-f812-4243-94ea-23345856d28f\") " pod="openstack/cloudkitty-db-create-99xdp" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.138169 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cr8q\" (UniqueName: \"kubernetes.io/projected/24adb929-f812-4243-94ea-23345856d28f-kube-api-access-6cr8q\") pod \"cloudkitty-db-create-99xdp\" (UID: \"24adb929-f812-4243-94ea-23345856d28f\") " pod="openstack/cloudkitty-db-create-99xdp" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.160447 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-6e38-account-create-update-dntk2" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.190162 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-v2g9n"] Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.191377 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-v2g9n" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.203407 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-rs9bx"] Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.204753 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-rs9bx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.205042 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-99xdp" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.209314 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.209482 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.209541 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4nhgp" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.215746 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.216872 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdgbd\" (UniqueName: \"kubernetes.io/projected/e3aedfe4-2bbb-46c9-97d4-8d6782c44707-kube-api-access-qdgbd\") pod \"barbican-db-create-m5rm5\" (UID: \"e3aedfe4-2bbb-46c9-97d4-8d6782c44707\") " pod="openstack/barbican-db-create-m5rm5" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.216912 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3aedfe4-2bbb-46c9-97d4-8d6782c44707-operator-scripts\") pod \"barbican-db-create-m5rm5\" (UID: \"e3aedfe4-2bbb-46c9-97d4-8d6782c44707\") " 
pod="openstack/barbican-db-create-m5rm5" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.217590 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3aedfe4-2bbb-46c9-97d4-8d6782c44707-operator-scripts\") pod \"barbican-db-create-m5rm5\" (UID: \"e3aedfe4-2bbb-46c9-97d4-8d6782c44707\") " pod="openstack/barbican-db-create-m5rm5" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.233301 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-a05d-account-create-update-cw8zv"] Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.234420 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-a05d-account-create-update-cw8zv" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.239042 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-db-secret" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.268801 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdgbd\" (UniqueName: \"kubernetes.io/projected/e3aedfe4-2bbb-46c9-97d4-8d6782c44707-kube-api-access-qdgbd\") pod \"barbican-db-create-m5rm5\" (UID: \"e3aedfe4-2bbb-46c9-97d4-8d6782c44707\") " pod="openstack/barbican-db-create-m5rm5" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.270751 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-a05d-account-create-update-cw8zv"] Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.312188 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-v2g9n"] Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.357083 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-m5rm5" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.362793 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29n8z\" (UniqueName: \"kubernetes.io/projected/58b577a3-c234-4968-a8e7-c5e629de47b1-kube-api-access-29n8z\") pod \"keystone-db-sync-rs9bx\" (UID: \"58b577a3-c234-4968-a8e7-c5e629de47b1\") " pod="openstack/keystone-db-sync-rs9bx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.362933 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfxln\" (UniqueName: \"kubernetes.io/projected/3ae26ad0-3770-4153-a1d6-96ae3a9e36a9-kube-api-access-jfxln\") pod \"neutron-db-create-v2g9n\" (UID: \"3ae26ad0-3770-4153-a1d6-96ae3a9e36a9\") " pod="openstack/neutron-db-create-v2g9n" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.362961 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmwsr\" (UniqueName: \"kubernetes.io/projected/6344c1fe-eecb-4d57-a5c7-a857e4466439-kube-api-access-rmwsr\") pod \"cloudkitty-a05d-account-create-update-cw8zv\" (UID: \"6344c1fe-eecb-4d57-a5c7-a857e4466439\") " pod="openstack/cloudkitty-a05d-account-create-update-cw8zv" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.362989 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6344c1fe-eecb-4d57-a5c7-a857e4466439-operator-scripts\") pod \"cloudkitty-a05d-account-create-update-cw8zv\" (UID: \"6344c1fe-eecb-4d57-a5c7-a857e4466439\") " pod="openstack/cloudkitty-a05d-account-create-update-cw8zv" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.363055 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/58b577a3-c234-4968-a8e7-c5e629de47b1-config-data\") pod \"keystone-db-sync-rs9bx\" (UID: \"58b577a3-c234-4968-a8e7-c5e629de47b1\") " pod="openstack/keystone-db-sync-rs9bx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.363089 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58b577a3-c234-4968-a8e7-c5e629de47b1-combined-ca-bundle\") pod \"keystone-db-sync-rs9bx\" (UID: \"58b577a3-c234-4968-a8e7-c5e629de47b1\") " pod="openstack/keystone-db-sync-rs9bx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.363220 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ae26ad0-3770-4153-a1d6-96ae3a9e36a9-operator-scripts\") pod \"neutron-db-create-v2g9n\" (UID: \"3ae26ad0-3770-4153-a1d6-96ae3a9e36a9\") " pod="openstack/neutron-db-create-v2g9n" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.418975 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-rs9bx"] Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.461794 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-a8a8-account-create-update-vcwwx"] Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.463313 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-a8a8-account-create-update-vcwwx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.469148 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.469142 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ae26ad0-3770-4153-a1d6-96ae3a9e36a9-operator-scripts\") pod \"neutron-db-create-v2g9n\" (UID: \"3ae26ad0-3770-4153-a1d6-96ae3a9e36a9\") " pod="openstack/neutron-db-create-v2g9n" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.469456 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29n8z\" (UniqueName: \"kubernetes.io/projected/58b577a3-c234-4968-a8e7-c5e629de47b1-kube-api-access-29n8z\") pod \"keystone-db-sync-rs9bx\" (UID: \"58b577a3-c234-4968-a8e7-c5e629de47b1\") " pod="openstack/keystone-db-sync-rs9bx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.469706 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfxln\" (UniqueName: \"kubernetes.io/projected/3ae26ad0-3770-4153-a1d6-96ae3a9e36a9-kube-api-access-jfxln\") pod \"neutron-db-create-v2g9n\" (UID: \"3ae26ad0-3770-4153-a1d6-96ae3a9e36a9\") " pod="openstack/neutron-db-create-v2g9n" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.469732 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmwsr\" (UniqueName: \"kubernetes.io/projected/6344c1fe-eecb-4d57-a5c7-a857e4466439-kube-api-access-rmwsr\") pod \"cloudkitty-a05d-account-create-update-cw8zv\" (UID: \"6344c1fe-eecb-4d57-a5c7-a857e4466439\") " pod="openstack/cloudkitty-a05d-account-create-update-cw8zv" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.469757 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6344c1fe-eecb-4d57-a5c7-a857e4466439-operator-scripts\") pod \"cloudkitty-a05d-account-create-update-cw8zv\" (UID: \"6344c1fe-eecb-4d57-a5c7-a857e4466439\") " pod="openstack/cloudkitty-a05d-account-create-update-cw8zv" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.469769 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ae26ad0-3770-4153-a1d6-96ae3a9e36a9-operator-scripts\") pod \"neutron-db-create-v2g9n\" (UID: \"3ae26ad0-3770-4153-a1d6-96ae3a9e36a9\") " pod="openstack/neutron-db-create-v2g9n" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.469869 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58b577a3-c234-4968-a8e7-c5e629de47b1-config-data\") pod \"keystone-db-sync-rs9bx\" (UID: \"58b577a3-c234-4968-a8e7-c5e629de47b1\") " pod="openstack/keystone-db-sync-rs9bx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.469916 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58b577a3-c234-4968-a8e7-c5e629de47b1-combined-ca-bundle\") pod \"keystone-db-sync-rs9bx\" (UID: \"58b577a3-c234-4968-a8e7-c5e629de47b1\") " pod="openstack/keystone-db-sync-rs9bx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.471449 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6344c1fe-eecb-4d57-a5c7-a857e4466439-operator-scripts\") pod \"cloudkitty-a05d-account-create-update-cw8zv\" (UID: \"6344c1fe-eecb-4d57-a5c7-a857e4466439\") " pod="openstack/cloudkitty-a05d-account-create-update-cw8zv" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.473144 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/58b577a3-c234-4968-a8e7-c5e629de47b1-combined-ca-bundle\") pod \"keystone-db-sync-rs9bx\" (UID: \"58b577a3-c234-4968-a8e7-c5e629de47b1\") " pod="openstack/keystone-db-sync-rs9bx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.476917 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58b577a3-c234-4968-a8e7-c5e629de47b1-config-data\") pod \"keystone-db-sync-rs9bx\" (UID: \"58b577a3-c234-4968-a8e7-c5e629de47b1\") " pod="openstack/keystone-db-sync-rs9bx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.485718 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-a8a8-account-create-update-vcwwx"] Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.491651 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29n8z\" (UniqueName: \"kubernetes.io/projected/58b577a3-c234-4968-a8e7-c5e629de47b1-kube-api-access-29n8z\") pod \"keystone-db-sync-rs9bx\" (UID: \"58b577a3-c234-4968-a8e7-c5e629de47b1\") " pod="openstack/keystone-db-sync-rs9bx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.492810 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmwsr\" (UniqueName: \"kubernetes.io/projected/6344c1fe-eecb-4d57-a5c7-a857e4466439-kube-api-access-rmwsr\") pod \"cloudkitty-a05d-account-create-update-cw8zv\" (UID: \"6344c1fe-eecb-4d57-a5c7-a857e4466439\") " pod="openstack/cloudkitty-a05d-account-create-update-cw8zv" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.495256 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfxln\" (UniqueName: \"kubernetes.io/projected/3ae26ad0-3770-4153-a1d6-96ae3a9e36a9-kube-api-access-jfxln\") pod \"neutron-db-create-v2g9n\" (UID: \"3ae26ad0-3770-4153-a1d6-96ae3a9e36a9\") " pod="openstack/neutron-db-create-v2g9n" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.496480 4781 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-4bde-account-create-update-fpg2t"] Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.497698 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4bde-account-create-update-fpg2t" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.499743 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.504753 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4bde-account-create-update-fpg2t"] Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.534006 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-v2g9n" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.571210 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b9cc074-4ea1-4c04-9398-5be68fbcd5cf-operator-scripts\") pod \"neutron-4bde-account-create-update-fpg2t\" (UID: \"5b9cc074-4ea1-4c04-9398-5be68fbcd5cf\") " pod="openstack/neutron-4bde-account-create-update-fpg2t" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.571295 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfkxw\" (UniqueName: \"kubernetes.io/projected/5b9cc074-4ea1-4c04-9398-5be68fbcd5cf-kube-api-access-dfkxw\") pod \"neutron-4bde-account-create-update-fpg2t\" (UID: \"5b9cc074-4ea1-4c04-9398-5be68fbcd5cf\") " pod="openstack/neutron-4bde-account-create-update-fpg2t" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.571322 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/388be198-b438-4142-8fb8-ec9831e9a1af-operator-scripts\") pod 
\"barbican-a8a8-account-create-update-vcwwx\" (UID: \"388be198-b438-4142-8fb8-ec9831e9a1af\") " pod="openstack/barbican-a8a8-account-create-update-vcwwx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.571339 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4dsb\" (UniqueName: \"kubernetes.io/projected/388be198-b438-4142-8fb8-ec9831e9a1af-kube-api-access-p4dsb\") pod \"barbican-a8a8-account-create-update-vcwwx\" (UID: \"388be198-b438-4142-8fb8-ec9831e9a1af\") " pod="openstack/barbican-a8a8-account-create-update-vcwwx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.647573 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-rs9bx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.667006 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-a05d-account-create-update-cw8zv" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.673541 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfkxw\" (UniqueName: \"kubernetes.io/projected/5b9cc074-4ea1-4c04-9398-5be68fbcd5cf-kube-api-access-dfkxw\") pod \"neutron-4bde-account-create-update-fpg2t\" (UID: \"5b9cc074-4ea1-4c04-9398-5be68fbcd5cf\") " pod="openstack/neutron-4bde-account-create-update-fpg2t" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.673587 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4dsb\" (UniqueName: \"kubernetes.io/projected/388be198-b438-4142-8fb8-ec9831e9a1af-kube-api-access-p4dsb\") pod \"barbican-a8a8-account-create-update-vcwwx\" (UID: \"388be198-b438-4142-8fb8-ec9831e9a1af\") " pod="openstack/barbican-a8a8-account-create-update-vcwwx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.673613 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/388be198-b438-4142-8fb8-ec9831e9a1af-operator-scripts\") pod \"barbican-a8a8-account-create-update-vcwwx\" (UID: \"388be198-b438-4142-8fb8-ec9831e9a1af\") " pod="openstack/barbican-a8a8-account-create-update-vcwwx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.673823 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b9cc074-4ea1-4c04-9398-5be68fbcd5cf-operator-scripts\") pod \"neutron-4bde-account-create-update-fpg2t\" (UID: \"5b9cc074-4ea1-4c04-9398-5be68fbcd5cf\") " pod="openstack/neutron-4bde-account-create-update-fpg2t" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.674365 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/388be198-b438-4142-8fb8-ec9831e9a1af-operator-scripts\") pod \"barbican-a8a8-account-create-update-vcwwx\" (UID: \"388be198-b438-4142-8fb8-ec9831e9a1af\") " pod="openstack/barbican-a8a8-account-create-update-vcwwx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.674505 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b9cc074-4ea1-4c04-9398-5be68fbcd5cf-operator-scripts\") pod \"neutron-4bde-account-create-update-fpg2t\" (UID: \"5b9cc074-4ea1-4c04-9398-5be68fbcd5cf\") " pod="openstack/neutron-4bde-account-create-update-fpg2t" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.689434 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfkxw\" (UniqueName: \"kubernetes.io/projected/5b9cc074-4ea1-4c04-9398-5be68fbcd5cf-kube-api-access-dfkxw\") pod \"neutron-4bde-account-create-update-fpg2t\" (UID: \"5b9cc074-4ea1-4c04-9398-5be68fbcd5cf\") " pod="openstack/neutron-4bde-account-create-update-fpg2t" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.690072 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4dsb\" (UniqueName: \"kubernetes.io/projected/388be198-b438-4142-8fb8-ec9831e9a1af-kube-api-access-p4dsb\") pod \"barbican-a8a8-account-create-update-vcwwx\" (UID: \"388be198-b438-4142-8fb8-ec9831e9a1af\") " pod="openstack/barbican-a8a8-account-create-update-vcwwx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.817457 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a8a8-account-create-update-vcwwx" Feb 27 00:26:20 crc kubenswrapper[4781]: I0227 00:26:20.826832 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4bde-account-create-update-fpg2t" Feb 27 00:26:25 crc kubenswrapper[4781]: I0227 00:26:25.231084 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8tmft" event={"ID":"47cc3f01-6a5c-4797-bf86-25770e66e928","Type":"ContainerStarted","Data":"e75379ab5c604b926c8da8b4e1bc70d938265b4b81cac412dc92c66988d11e4a"} Feb 27 00:26:25 crc kubenswrapper[4781]: I0227 00:26:25.236571 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1f85c54b-b800-429a-ba2d-fe22056ac907","Type":"ContainerStarted","Data":"0d295c8666e863d2c0e4e0d3a3e33356c58f61c54e944f8ced4d911133124bc0"} Feb 27 00:26:25 crc kubenswrapper[4781]: I0227 00:26:25.243998 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-a05d-account-create-update-cw8zv"] Feb 27 00:26:25 crc kubenswrapper[4781]: I0227 00:26:25.258245 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-rs9bx"] Feb 27 00:26:25 crc kubenswrapper[4781]: I0227 00:26:25.282362 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-8tmft" podStartSLOduration=2.179377493 podStartE2EDuration="17.282340502s" podCreationTimestamp="2026-02-27 00:26:08 +0000 UTC" 
firstStartedPulling="2026-02-27 00:26:09.150880861 +0000 UTC m=+1238.408420415" lastFinishedPulling="2026-02-27 00:26:24.25384387 +0000 UTC m=+1253.511383424" observedRunningTime="2026-02-27 00:26:25.271422723 +0000 UTC m=+1254.528962277" watchObservedRunningTime="2026-02-27 00:26:25.282340502 +0000 UTC m=+1254.539880056" Feb 27 00:26:25 crc kubenswrapper[4781]: I0227 00:26:25.320387 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 00:26:25 crc kubenswrapper[4781]: W0227 00:26:25.327253 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod119c35de_5e7a_4d3f_af8a_3595d7dc69aa.slice/crio-8c83495fdbbabbe88d3f2dcacc03d6929c80894b6bc99aa2756ea3f2ad7bcd7b WatchSource:0}: Error finding container 8c83495fdbbabbe88d3f2dcacc03d6929c80894b6bc99aa2756ea3f2ad7bcd7b: Status 404 returned error can't find the container with id 8c83495fdbbabbe88d3f2dcacc03d6929c80894b6bc99aa2756ea3f2ad7bcd7b Feb 27 00:26:25 crc kubenswrapper[4781]: I0227 00:26:25.347265 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9zkpb-config-9k74f"] Feb 27 00:26:25 crc kubenswrapper[4781]: I0227 00:26:25.355676 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-v2g9n"] Feb 27 00:26:25 crc kubenswrapper[4781]: I0227 00:26:25.362619 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-99xdp"] Feb 27 00:26:25 crc kubenswrapper[4781]: I0227 00:26:25.411115 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-a8a8-account-create-update-vcwwx"] Feb 27 00:26:25 crc kubenswrapper[4781]: I0227 00:26:25.430201 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-wxsbg"] Feb 27 00:26:25 crc kubenswrapper[4781]: I0227 00:26:25.431812 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/prometheus-metric-storage-0" podStartSLOduration=16.120108302 podStartE2EDuration="1m21.43179206s" podCreationTimestamp="2026-02-27 00:25:04 +0000 UTC" firstStartedPulling="2026-02-27 00:25:18.874938737 +0000 UTC m=+1188.132478291" lastFinishedPulling="2026-02-27 00:26:24.186622495 +0000 UTC m=+1253.444162049" observedRunningTime="2026-02-27 00:26:25.328933733 +0000 UTC m=+1254.586473287" watchObservedRunningTime="2026-02-27 00:26:25.43179206 +0000 UTC m=+1254.689331614" Feb 27 00:26:25 crc kubenswrapper[4781]: I0227 00:26:25.447864 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-m5rm5"] Feb 27 00:26:25 crc kubenswrapper[4781]: I0227 00:26:25.454688 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4bde-account-create-update-fpg2t"] Feb 27 00:26:25 crc kubenswrapper[4781]: I0227 00:26:25.461091 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-zvn4t"] Feb 27 00:26:25 crc kubenswrapper[4781]: I0227 00:26:25.468242 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-6e38-account-create-update-dntk2"] Feb 27 00:26:25 crc kubenswrapper[4781]: I0227 00:26:25.475788 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.098091 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.252250 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-v2g9n" event={"ID":"3ae26ad0-3770-4153-a1d6-96ae3a9e36a9","Type":"ContainerStarted","Data":"c1465b73a1df33b94300981b2d1ed1143dd7203d14e97be01d951e1a43d63b4b"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.252485 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-v2g9n" 
event={"ID":"3ae26ad0-3770-4153-a1d6-96ae3a9e36a9","Type":"ContainerStarted","Data":"30f6389e339c637a76ed83badeaae69166aa47b033d4e6235f9518305d3ee600"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.255130 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11","Type":"ContainerStarted","Data":"94beb6abb1958b96717d500a6631fce3acfe4486c10c8cc84b786a985608d0c9"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.258036 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-a05d-account-create-update-cw8zv" event={"ID":"6344c1fe-eecb-4d57-a5c7-a857e4466439","Type":"ContainerDied","Data":"6d76d1e8767f2bf9f86c0f509bcf89309b39540bcf16a94f15017d9639753143"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.264921 4781 generic.go:334] "Generic (PLEG): container finished" podID="6344c1fe-eecb-4d57-a5c7-a857e4466439" containerID="6d76d1e8767f2bf9f86c0f509bcf89309b39540bcf16a94f15017d9639753143" exitCode=0 Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.265057 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-a05d-account-create-update-cw8zv" event={"ID":"6344c1fe-eecb-4d57-a5c7-a857e4466439","Type":"ContainerStarted","Data":"3156fc4e82e446fff06ff8e16e3fad71473d705c6d619f1c07498df33a7e7f1a"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.271606 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-v2g9n" podStartSLOduration=6.271591056 podStartE2EDuration="6.271591056s" podCreationTimestamp="2026-02-27 00:26:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:26:26.268112854 +0000 UTC m=+1255.525652408" watchObservedRunningTime="2026-02-27 00:26:26.271591056 +0000 UTC m=+1255.529130610" Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.275801 4781 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-zvn4t" event={"ID":"e8806487-486f-464d-8249-b6368daabff5","Type":"ContainerStarted","Data":"08f09b8baf0d256e75e4f2cea8a8050728aa867b805093cf4bae153a92736b36"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.275840 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-zvn4t" event={"ID":"e8806487-486f-464d-8249-b6368daabff5","Type":"ContainerStarted","Data":"acca40850a65ffa395f9bc3d270f6d4791d803b54b4d4bdd03263ee754d9ce94"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.283346 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4bde-account-create-update-fpg2t" event={"ID":"5b9cc074-4ea1-4c04-9398-5be68fbcd5cf","Type":"ContainerStarted","Data":"d19d827d09664d0dd3483609af04ecbb9a2549b9335d9da322a84e9180f2130b"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.283389 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4bde-account-create-update-fpg2t" event={"ID":"5b9cc074-4ea1-4c04-9398-5be68fbcd5cf","Type":"ContainerStarted","Data":"12fea8b77cbfdc1c7a64b2da8721e9a6a6590ac1e465f6d0c6156cdbd111bf81"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.286675 4781 generic.go:334] "Generic (PLEG): container finished" podID="24adb929-f812-4243-94ea-23345856d28f" containerID="9fc8ab8561670a45356ed0c0f51ff964f3556019e4a98628e764c0be8c981d4c" exitCode=0 Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.286726 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-99xdp" event={"ID":"24adb929-f812-4243-94ea-23345856d28f","Type":"ContainerDied","Data":"9fc8ab8561670a45356ed0c0f51ff964f3556019e4a98628e764c0be8c981d4c"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.286749 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-99xdp" 
event={"ID":"24adb929-f812-4243-94ea-23345856d28f","Type":"ContainerStarted","Data":"afb6fce1583a2499e12761dfb2a4e40745be9e1f5459d6226e780d1b7f1a01e4"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.292327 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9zkpb-config-9k74f" event={"ID":"119c35de-5e7a-4d3f-af8a-3595d7dc69aa","Type":"ContainerStarted","Data":"beeaff089c6577afca77da55c908132f8c47a3993cf1d2011eea873db182b172"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.292377 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9zkpb-config-9k74f" event={"ID":"119c35de-5e7a-4d3f-af8a-3595d7dc69aa","Type":"ContainerStarted","Data":"8c83495fdbbabbe88d3f2dcacc03d6929c80894b6bc99aa2756ea3f2ad7bcd7b"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.297852 4781 generic.go:334] "Generic (PLEG): container finished" podID="388be198-b438-4142-8fb8-ec9831e9a1af" containerID="f6f1fd0f3e8826d700e5044d1fe1b6b827695311ff2f847e95e5ba49a2863393" exitCode=0 Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.297899 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a8a8-account-create-update-vcwwx" event={"ID":"388be198-b438-4142-8fb8-ec9831e9a1af","Type":"ContainerDied","Data":"f6f1fd0f3e8826d700e5044d1fe1b6b827695311ff2f847e95e5ba49a2863393"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.297949 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a8a8-account-create-update-vcwwx" event={"ID":"388be198-b438-4142-8fb8-ec9831e9a1af","Type":"ContainerStarted","Data":"42d459e8a3a5d054ca75741044d1df38f0502d5c39db8635a77da335cae8d852"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.301668 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rs9bx" 
event={"ID":"58b577a3-c234-4968-a8e7-c5e629de47b1","Type":"ContainerStarted","Data":"9adcda9f33abe1db2735efdad7642538c67bab67f10201630c502ea9fc7b9c52"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.303220 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-zvn4t" podStartSLOduration=7.303199271 podStartE2EDuration="7.303199271s" podCreationTimestamp="2026-02-27 00:26:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:26:26.295797655 +0000 UTC m=+1255.553337209" watchObservedRunningTime="2026-02-27 00:26:26.303199271 +0000 UTC m=+1255.560738825" Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.304780 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-m5rm5" event={"ID":"e3aedfe4-2bbb-46c9-97d4-8d6782c44707","Type":"ContainerStarted","Data":"6dace96637328dc4640d3549a1c802cf99efe23b4ad5c291813668a60dc8b49e"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.304813 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-m5rm5" event={"ID":"e3aedfe4-2bbb-46c9-97d4-8d6782c44707","Type":"ContainerStarted","Data":"4306df6b2bf7f7647c5412693ad76ea23e53776c7458f39ce7d1dce54668a6a1"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.306184 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wxsbg" event={"ID":"cafd294d-e929-4cd5-8be3-7175ad4aed09","Type":"ContainerStarted","Data":"6743d7b0c9868a62aac9ecae7e0ec57bc1eee6923be88c6054b55ea63c96129c"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.306208 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wxsbg" event={"ID":"cafd294d-e929-4cd5-8be3-7175ad4aed09","Type":"ContainerStarted","Data":"d0d777854ef43da96dbfe08c3e3579fc6ad7e045141adca768cf705a1bf88479"} Feb 27 00:26:26 crc 
kubenswrapper[4781]: I0227 00:26:26.316749 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-4bde-account-create-update-fpg2t" podStartSLOduration=6.316726738 podStartE2EDuration="6.316726738s" podCreationTimestamp="2026-02-27 00:26:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:26:26.316726248 +0000 UTC m=+1255.574265802" watchObservedRunningTime="2026-02-27 00:26:26.316726738 +0000 UTC m=+1255.574266292" Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.325197 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6e38-account-create-update-dntk2" event={"ID":"0eb55288-e9bb-46f0-bae3-789e8db036cf","Type":"ContainerStarted","Data":"297b6944b15c3822e081c593733409a3c29b72246756946b04eaf97a2a16c5d2"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.325231 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6e38-account-create-update-dntk2" event={"ID":"0eb55288-e9bb-46f0-bae3-789e8db036cf","Type":"ContainerStarted","Data":"d82ba0a5a4bcff589056e0ce6141c47f60a6cc39ed51e693d6fad4b70237438f"} Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.337308 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-9zkpb-config-9k74f" podStartSLOduration=13.337293591 podStartE2EDuration="13.337293591s" podCreationTimestamp="2026-02-27 00:26:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:26:26.335400431 +0000 UTC m=+1255.592939985" watchObservedRunningTime="2026-02-27 00:26:26.337293591 +0000 UTC m=+1255.594833145" Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.371850 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-wxsbg" 
podStartSLOduration=11.371832254 podStartE2EDuration="11.371832254s" podCreationTimestamp="2026-02-27 00:26:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:26:26.369011479 +0000 UTC m=+1255.626551033" watchObservedRunningTime="2026-02-27 00:26:26.371832254 +0000 UTC m=+1255.629371808" Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.411144 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-m5rm5" podStartSLOduration=7.411127672 podStartE2EDuration="7.411127672s" podCreationTimestamp="2026-02-27 00:26:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:26:26.405253427 +0000 UTC m=+1255.662792981" watchObservedRunningTime="2026-02-27 00:26:26.411127672 +0000 UTC m=+1255.668667226" Feb 27 00:26:26 crc kubenswrapper[4781]: I0227 00:26:26.964679 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="2691e066-2f4c-4e7e-bcac-01933bd6cadb" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 27 00:26:27 crc kubenswrapper[4781]: I0227 00:26:27.339911 4781 generic.go:334] "Generic (PLEG): container finished" podID="e8806487-486f-464d-8249-b6368daabff5" containerID="08f09b8baf0d256e75e4f2cea8a8050728aa867b805093cf4bae153a92736b36" exitCode=0 Feb 27 00:26:27 crc kubenswrapper[4781]: I0227 00:26:27.340002 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-zvn4t" event={"ID":"e8806487-486f-464d-8249-b6368daabff5","Type":"ContainerDied","Data":"08f09b8baf0d256e75e4f2cea8a8050728aa867b805093cf4bae153a92736b36"} Feb 27 00:26:27 crc kubenswrapper[4781]: I0227 00:26:27.341830 4781 generic.go:334] "Generic (PLEG): container finished" podID="5b9cc074-4ea1-4c04-9398-5be68fbcd5cf" 
containerID="d19d827d09664d0dd3483609af04ecbb9a2549b9335d9da322a84e9180f2130b" exitCode=0 Feb 27 00:26:27 crc kubenswrapper[4781]: I0227 00:26:27.341894 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4bde-account-create-update-fpg2t" event={"ID":"5b9cc074-4ea1-4c04-9398-5be68fbcd5cf","Type":"ContainerDied","Data":"d19d827d09664d0dd3483609af04ecbb9a2549b9335d9da322a84e9180f2130b"} Feb 27 00:26:27 crc kubenswrapper[4781]: I0227 00:26:27.343697 4781 generic.go:334] "Generic (PLEG): container finished" podID="0eb55288-e9bb-46f0-bae3-789e8db036cf" containerID="297b6944b15c3822e081c593733409a3c29b72246756946b04eaf97a2a16c5d2" exitCode=0 Feb 27 00:26:27 crc kubenswrapper[4781]: I0227 00:26:27.343752 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6e38-account-create-update-dntk2" event={"ID":"0eb55288-e9bb-46f0-bae3-789e8db036cf","Type":"ContainerDied","Data":"297b6944b15c3822e081c593733409a3c29b72246756946b04eaf97a2a16c5d2"} Feb 27 00:26:27 crc kubenswrapper[4781]: I0227 00:26:27.346172 4781 generic.go:334] "Generic (PLEG): container finished" podID="cafd294d-e929-4cd5-8be3-7175ad4aed09" containerID="6743d7b0c9868a62aac9ecae7e0ec57bc1eee6923be88c6054b55ea63c96129c" exitCode=0 Feb 27 00:26:27 crc kubenswrapper[4781]: I0227 00:26:27.346241 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wxsbg" event={"ID":"cafd294d-e929-4cd5-8be3-7175ad4aed09","Type":"ContainerDied","Data":"6743d7b0c9868a62aac9ecae7e0ec57bc1eee6923be88c6054b55ea63c96129c"} Feb 27 00:26:27 crc kubenswrapper[4781]: I0227 00:26:27.348324 4781 generic.go:334] "Generic (PLEG): container finished" podID="3ae26ad0-3770-4153-a1d6-96ae3a9e36a9" containerID="c1465b73a1df33b94300981b2d1ed1143dd7203d14e97be01d951e1a43d63b4b" exitCode=0 Feb 27 00:26:27 crc kubenswrapper[4781]: I0227 00:26:27.348383 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-v2g9n" 
event={"ID":"3ae26ad0-3770-4153-a1d6-96ae3a9e36a9","Type":"ContainerDied","Data":"c1465b73a1df33b94300981b2d1ed1143dd7203d14e97be01d951e1a43d63b4b"} Feb 27 00:26:27 crc kubenswrapper[4781]: I0227 00:26:27.365505 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11","Type":"ContainerStarted","Data":"f853867bfa4e786c83dc2205099ab25248d1c85dc15de999b04de22c2ab4daf6"} Feb 27 00:26:27 crc kubenswrapper[4781]: I0227 00:26:27.370366 4781 generic.go:334] "Generic (PLEG): container finished" podID="119c35de-5e7a-4d3f-af8a-3595d7dc69aa" containerID="beeaff089c6577afca77da55c908132f8c47a3993cf1d2011eea873db182b172" exitCode=0 Feb 27 00:26:27 crc kubenswrapper[4781]: I0227 00:26:27.370459 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9zkpb-config-9k74f" event={"ID":"119c35de-5e7a-4d3f-af8a-3595d7dc69aa","Type":"ContainerDied","Data":"beeaff089c6577afca77da55c908132f8c47a3993cf1d2011eea873db182b172"} Feb 27 00:26:27 crc kubenswrapper[4781]: I0227 00:26:27.376900 4781 generic.go:334] "Generic (PLEG): container finished" podID="e3aedfe4-2bbb-46c9-97d4-8d6782c44707" containerID="6dace96637328dc4640d3549a1c802cf99efe23b4ad5c291813668a60dc8b49e" exitCode=0 Feb 27 00:26:27 crc kubenswrapper[4781]: I0227 00:26:27.377120 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-m5rm5" event={"ID":"e3aedfe4-2bbb-46c9-97d4-8d6782c44707","Type":"ContainerDied","Data":"6dace96637328dc4640d3549a1c802cf99efe23b4ad5c291813668a60dc8b49e"} Feb 27 00:26:28 crc kubenswrapper[4781]: I0227 00:26:28.389994 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11","Type":"ContainerStarted","Data":"f1a98806257acaa19b4b9b86ece8903557df5c930cd956c6c9c8e2bb9dcd294c"} Feb 27 00:26:28 crc kubenswrapper[4781]: I0227 00:26:28.390425 4781 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/swift-storage-0" event={"ID":"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11","Type":"ContainerStarted","Data":"4cc9e5526c315ad1df3a3d8a30dc027e8f03c4458e5f81b28f262a635d116fb9"} Feb 27 00:26:28 crc kubenswrapper[4781]: I0227 00:26:28.412226 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-9zkpb" Feb 27 00:26:29 crc kubenswrapper[4781]: I0227 00:26:29.415907 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.350105 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-v2g9n" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.378038 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-wxsbg" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.440069 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9zkpb-config-9k74f" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.465019 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a8a8-account-create-update-vcwwx" event={"ID":"388be198-b438-4142-8fb8-ec9831e9a1af","Type":"ContainerDied","Data":"42d459e8a3a5d054ca75741044d1df38f0502d5c39db8635a77da335cae8d852"} Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.465056 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42d459e8a3a5d054ca75741044d1df38f0502d5c39db8635a77da335cae8d852" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.473831 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-4bde-account-create-update-fpg2t" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.474016 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-m5rm5" event={"ID":"e3aedfe4-2bbb-46c9-97d4-8d6782c44707","Type":"ContainerDied","Data":"4306df6b2bf7f7647c5412693ad76ea23e53776c7458f39ce7d1dce54668a6a1"} Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.474044 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4306df6b2bf7f7647c5412693ad76ea23e53776c7458f39ce7d1dce54668a6a1" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.479190 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4bde-account-create-update-fpg2t" event={"ID":"5b9cc074-4ea1-4c04-9398-5be68fbcd5cf","Type":"ContainerDied","Data":"12fea8b77cbfdc1c7a64b2da8721e9a6a6590ac1e465f6d0c6156cdbd111bf81"} Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.479238 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12fea8b77cbfdc1c7a64b2da8721e9a6a6590ac1e465f6d0c6156cdbd111bf81" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.479270 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-4bde-account-create-update-fpg2t" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.489960 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6e38-account-create-update-dntk2" event={"ID":"0eb55288-e9bb-46f0-bae3-789e8db036cf","Type":"ContainerDied","Data":"d82ba0a5a4bcff589056e0ce6141c47f60a6cc39ed51e693d6fad4b70237438f"} Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.490005 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d82ba0a5a4bcff589056e0ce6141c47f60a6cc39ed51e693d6fad4b70237438f" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.490203 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-6e38-account-create-update-dntk2" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.508538 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0eb55288-e9bb-46f0-bae3-789e8db036cf-operator-scripts\") pod \"0eb55288-e9bb-46f0-bae3-789e8db036cf\" (UID: \"0eb55288-e9bb-46f0-bae3-789e8db036cf\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.508589 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-scripts\") pod \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.508615 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cafd294d-e929-4cd5-8be3-7175ad4aed09-operator-scripts\") pod \"cafd294d-e929-4cd5-8be3-7175ad4aed09\" (UID: \"cafd294d-e929-4cd5-8be3-7175ad4aed09\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.508677 4781 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-var-run-ovn\") pod \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.508719 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfxln\" (UniqueName: \"kubernetes.io/projected/3ae26ad0-3770-4153-a1d6-96ae3a9e36a9-kube-api-access-jfxln\") pod \"3ae26ad0-3770-4153-a1d6-96ae3a9e36a9\" (UID: \"3ae26ad0-3770-4153-a1d6-96ae3a9e36a9\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.508748 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjdvb\" (UniqueName: \"kubernetes.io/projected/0eb55288-e9bb-46f0-bae3-789e8db036cf-kube-api-access-wjdvb\") pod \"0eb55288-e9bb-46f0-bae3-789e8db036cf\" (UID: \"0eb55288-e9bb-46f0-bae3-789e8db036cf\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.508789 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-additional-scripts\") pod \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.508814 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ae26ad0-3770-4153-a1d6-96ae3a9e36a9-operator-scripts\") pod \"3ae26ad0-3770-4153-a1d6-96ae3a9e36a9\" (UID: \"3ae26ad0-3770-4153-a1d6-96ae3a9e36a9\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.508839 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zhb8\" (UniqueName: \"kubernetes.io/projected/cafd294d-e929-4cd5-8be3-7175ad4aed09-kube-api-access-5zhb8\") pod 
\"cafd294d-e929-4cd5-8be3-7175ad4aed09\" (UID: \"cafd294d-e929-4cd5-8be3-7175ad4aed09\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.508912 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-var-log-ovn\") pod \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.508942 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfkxw\" (UniqueName: \"kubernetes.io/projected/5b9cc074-4ea1-4c04-9398-5be68fbcd5cf-kube-api-access-dfkxw\") pod \"5b9cc074-4ea1-4c04-9398-5be68fbcd5cf\" (UID: \"5b9cc074-4ea1-4c04-9398-5be68fbcd5cf\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.508969 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6wjb\" (UniqueName: \"kubernetes.io/projected/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-kube-api-access-j6wjb\") pod \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.509037 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-var-run\") pod \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\" (UID: \"119c35de-5e7a-4d3f-af8a-3595d7dc69aa\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.509069 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b9cc074-4ea1-4c04-9398-5be68fbcd5cf-operator-scripts\") pod \"5b9cc074-4ea1-4c04-9398-5be68fbcd5cf\" (UID: \"5b9cc074-4ea1-4c04-9398-5be68fbcd5cf\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.510201 4781 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/host-path/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-var-run" (OuterVolumeSpecName: "var-run") pod "119c35de-5e7a-4d3f-af8a-3595d7dc69aa" (UID: "119c35de-5e7a-4d3f-af8a-3595d7dc69aa"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.510381 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "119c35de-5e7a-4d3f-af8a-3595d7dc69aa" (UID: "119c35de-5e7a-4d3f-af8a-3595d7dc69aa"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.510899 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "119c35de-5e7a-4d3f-af8a-3595d7dc69aa" (UID: "119c35de-5e7a-4d3f-af8a-3595d7dc69aa"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.510996 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ae26ad0-3770-4153-a1d6-96ae3a9e36a9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3ae26ad0-3770-4153-a1d6-96ae3a9e36a9" (UID: "3ae26ad0-3770-4153-a1d6-96ae3a9e36a9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.511074 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-scripts" (OuterVolumeSpecName: "scripts") pod "119c35de-5e7a-4d3f-af8a-3595d7dc69aa" (UID: "119c35de-5e7a-4d3f-af8a-3595d7dc69aa"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.511326 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0eb55288-e9bb-46f0-bae3-789e8db036cf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0eb55288-e9bb-46f0-bae3-789e8db036cf" (UID: "0eb55288-e9bb-46f0-bae3-789e8db036cf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.511508 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-99xdp" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.511807 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cafd294d-e929-4cd5-8be3-7175ad4aed09-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cafd294d-e929-4cd5-8be3-7175ad4aed09" (UID: "cafd294d-e929-4cd5-8be3-7175ad4aed09"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.512446 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "119c35de-5e7a-4d3f-af8a-3595d7dc69aa" (UID: "119c35de-5e7a-4d3f-af8a-3595d7dc69aa"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.518795 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0eb55288-e9bb-46f0-bae3-789e8db036cf-kube-api-access-wjdvb" (OuterVolumeSpecName: "kube-api-access-wjdvb") pod "0eb55288-e9bb-46f0-bae3-789e8db036cf" (UID: "0eb55288-e9bb-46f0-bae3-789e8db036cf"). 
InnerVolumeSpecName "kube-api-access-wjdvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.522810 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-kube-api-access-j6wjb" (OuterVolumeSpecName: "kube-api-access-j6wjb") pod "119c35de-5e7a-4d3f-af8a-3595d7dc69aa" (UID: "119c35de-5e7a-4d3f-af8a-3595d7dc69aa"). InnerVolumeSpecName "kube-api-access-j6wjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.523324 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cafd294d-e929-4cd5-8be3-7175ad4aed09-kube-api-access-5zhb8" (OuterVolumeSpecName: "kube-api-access-5zhb8") pod "cafd294d-e929-4cd5-8be3-7175ad4aed09" (UID: "cafd294d-e929-4cd5-8be3-7175ad4aed09"). InnerVolumeSpecName "kube-api-access-5zhb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.524552 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b9cc074-4ea1-4c04-9398-5be68fbcd5cf-kube-api-access-dfkxw" (OuterVolumeSpecName: "kube-api-access-dfkxw") pod "5b9cc074-4ea1-4c04-9398-5be68fbcd5cf" (UID: "5b9cc074-4ea1-4c04-9398-5be68fbcd5cf"). InnerVolumeSpecName "kube-api-access-dfkxw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.524969 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wxsbg" event={"ID":"cafd294d-e929-4cd5-8be3-7175ad4aed09","Type":"ContainerDied","Data":"d0d777854ef43da96dbfe08c3e3579fc6ad7e045141adca768cf705a1bf88479"} Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.525021 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0d777854ef43da96dbfe08c3e3579fc6ad7e045141adca768cf705a1bf88479" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.525198 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-wxsbg" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.525809 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b9cc074-4ea1-4c04-9398-5be68fbcd5cf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5b9cc074-4ea1-4c04-9398-5be68fbcd5cf" (UID: "5b9cc074-4ea1-4c04-9398-5be68fbcd5cf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.544826 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ae26ad0-3770-4153-a1d6-96ae3a9e36a9-kube-api-access-jfxln" (OuterVolumeSpecName: "kube-api-access-jfxln") pod "3ae26ad0-3770-4153-a1d6-96ae3a9e36a9" (UID: "3ae26ad0-3770-4153-a1d6-96ae3a9e36a9"). InnerVolumeSpecName "kube-api-access-jfxln". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.545098 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-v2g9n" event={"ID":"3ae26ad0-3770-4153-a1d6-96ae3a9e36a9","Type":"ContainerDied","Data":"30f6389e339c637a76ed83badeaae69166aa47b033d4e6235f9518305d3ee600"} Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.545129 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30f6389e339c637a76ed83badeaae69166aa47b033d4e6235f9518305d3ee600" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.545239 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a8a8-account-create-update-vcwwx" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.545293 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-v2g9n" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.558959 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9zkpb-config-9k74f" event={"ID":"119c35de-5e7a-4d3f-af8a-3595d7dc69aa","Type":"ContainerDied","Data":"8c83495fdbbabbe88d3f2dcacc03d6929c80894b6bc99aa2756ea3f2ad7bcd7b"} Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.559011 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c83495fdbbabbe88d3f2dcacc03d6929c80894b6bc99aa2756ea3f2ad7bcd7b" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.559092 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9zkpb-config-9k74f" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.610076 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/388be198-b438-4142-8fb8-ec9831e9a1af-operator-scripts\") pod \"388be198-b438-4142-8fb8-ec9831e9a1af\" (UID: \"388be198-b438-4142-8fb8-ec9831e9a1af\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.610422 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cr8q\" (UniqueName: \"kubernetes.io/projected/24adb929-f812-4243-94ea-23345856d28f-kube-api-access-6cr8q\") pod \"24adb929-f812-4243-94ea-23345856d28f\" (UID: \"24adb929-f812-4243-94ea-23345856d28f\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.610458 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4dsb\" (UniqueName: \"kubernetes.io/projected/388be198-b438-4142-8fb8-ec9831e9a1af-kube-api-access-p4dsb\") pod \"388be198-b438-4142-8fb8-ec9831e9a1af\" (UID: \"388be198-b438-4142-8fb8-ec9831e9a1af\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.610496 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24adb929-f812-4243-94ea-23345856d28f-operator-scripts\") pod \"24adb929-f812-4243-94ea-23345856d28f\" (UID: \"24adb929-f812-4243-94ea-23345856d28f\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.610776 4781 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.610794 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfkxw\" (UniqueName: 
\"kubernetes.io/projected/5b9cc074-4ea1-4c04-9398-5be68fbcd5cf-kube-api-access-dfkxw\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.610804 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6wjb\" (UniqueName: \"kubernetes.io/projected/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-kube-api-access-j6wjb\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.610813 4781 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-var-run\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.610821 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b9cc074-4ea1-4c04-9398-5be68fbcd5cf-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.610829 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0eb55288-e9bb-46f0-bae3-789e8db036cf-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.610837 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.610844 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cafd294d-e929-4cd5-8be3-7175ad4aed09-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.610852 4781 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 27 
00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.610861 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfxln\" (UniqueName: \"kubernetes.io/projected/3ae26ad0-3770-4153-a1d6-96ae3a9e36a9-kube-api-access-jfxln\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.610871 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjdvb\" (UniqueName: \"kubernetes.io/projected/0eb55288-e9bb-46f0-bae3-789e8db036cf-kube-api-access-wjdvb\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.610878 4781 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/119c35de-5e7a-4d3f-af8a-3595d7dc69aa-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.610887 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ae26ad0-3770-4153-a1d6-96ae3a9e36a9-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.610896 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zhb8\" (UniqueName: \"kubernetes.io/projected/cafd294d-e929-4cd5-8be3-7175ad4aed09-kube-api-access-5zhb8\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.611260 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24adb929-f812-4243-94ea-23345856d28f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "24adb929-f812-4243-94ea-23345856d28f" (UID: "24adb929-f812-4243-94ea-23345856d28f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.612891 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-a05d-account-create-update-cw8zv" event={"ID":"6344c1fe-eecb-4d57-a5c7-a857e4466439","Type":"ContainerDied","Data":"3156fc4e82e446fff06ff8e16e3fad71473d705c6d619f1c07498df33a7e7f1a"} Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.612930 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3156fc4e82e446fff06ff8e16e3fad71473d705c6d619f1c07498df33a7e7f1a" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.614392 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24adb929-f812-4243-94ea-23345856d28f-kube-api-access-6cr8q" (OuterVolumeSpecName: "kube-api-access-6cr8q") pod "24adb929-f812-4243-94ea-23345856d28f" (UID: "24adb929-f812-4243-94ea-23345856d28f"). InnerVolumeSpecName "kube-api-access-6cr8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.614963 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/388be198-b438-4142-8fb8-ec9831e9a1af-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "388be198-b438-4142-8fb8-ec9831e9a1af" (UID: "388be198-b438-4142-8fb8-ec9831e9a1af"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.620897 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/388be198-b438-4142-8fb8-ec9831e9a1af-kube-api-access-p4dsb" (OuterVolumeSpecName: "kube-api-access-p4dsb") pod "388be198-b438-4142-8fb8-ec9831e9a1af" (UID: "388be198-b438-4142-8fb8-ec9831e9a1af"). InnerVolumeSpecName "kube-api-access-p4dsb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.627068 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-zvn4t" event={"ID":"e8806487-486f-464d-8249-b6368daabff5","Type":"ContainerDied","Data":"acca40850a65ffa395f9bc3d270f6d4791d803b54b4d4bdd03263ee754d9ce94"} Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.627114 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acca40850a65ffa395f9bc3d270f6d4791d803b54b4d4bdd03263ee754d9ce94" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.648879 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-99xdp" event={"ID":"24adb929-f812-4243-94ea-23345856d28f","Type":"ContainerDied","Data":"afb6fce1583a2499e12761dfb2a4e40745be9e1f5459d6226e780d1b7f1a01e4"} Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.648917 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afb6fce1583a2499e12761dfb2a4e40745be9e1f5459d6226e780d1b7f1a01e4" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.648977 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-99xdp" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.684831 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-a05d-account-create-update-cw8zv" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.687219 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-zvn4t" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.696511 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-m5rm5" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.713130 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/388be198-b438-4142-8fb8-ec9831e9a1af-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.713240 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cr8q\" (UniqueName: \"kubernetes.io/projected/24adb929-f812-4243-94ea-23345856d28f-kube-api-access-6cr8q\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.713267 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4dsb\" (UniqueName: \"kubernetes.io/projected/388be198-b438-4142-8fb8-ec9831e9a1af-kube-api-access-p4dsb\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.713284 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24adb929-f812-4243-94ea-23345856d28f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.815565 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qh4qm\" (UniqueName: \"kubernetes.io/projected/e8806487-486f-464d-8249-b6368daabff5-kube-api-access-qh4qm\") pod \"e8806487-486f-464d-8249-b6368daabff5\" (UID: \"e8806487-486f-464d-8249-b6368daabff5\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.815650 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmwsr\" (UniqueName: \"kubernetes.io/projected/6344c1fe-eecb-4d57-a5c7-a857e4466439-kube-api-access-rmwsr\") pod \"6344c1fe-eecb-4d57-a5c7-a857e4466439\" (UID: \"6344c1fe-eecb-4d57-a5c7-a857e4466439\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.815819 4781 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8806487-486f-464d-8249-b6368daabff5-operator-scripts\") pod \"e8806487-486f-464d-8249-b6368daabff5\" (UID: \"e8806487-486f-464d-8249-b6368daabff5\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.815864 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6344c1fe-eecb-4d57-a5c7-a857e4466439-operator-scripts\") pod \"6344c1fe-eecb-4d57-a5c7-a857e4466439\" (UID: \"6344c1fe-eecb-4d57-a5c7-a857e4466439\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.815900 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3aedfe4-2bbb-46c9-97d4-8d6782c44707-operator-scripts\") pod \"e3aedfe4-2bbb-46c9-97d4-8d6782c44707\" (UID: \"e3aedfe4-2bbb-46c9-97d4-8d6782c44707\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.815977 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdgbd\" (UniqueName: \"kubernetes.io/projected/e3aedfe4-2bbb-46c9-97d4-8d6782c44707-kube-api-access-qdgbd\") pod \"e3aedfe4-2bbb-46c9-97d4-8d6782c44707\" (UID: \"e3aedfe4-2bbb-46c9-97d4-8d6782c44707\") " Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.816677 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8806487-486f-464d-8249-b6368daabff5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e8806487-486f-464d-8249-b6368daabff5" (UID: "e8806487-486f-464d-8249-b6368daabff5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.819946 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3aedfe4-2bbb-46c9-97d4-8d6782c44707-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e3aedfe4-2bbb-46c9-97d4-8d6782c44707" (UID: "e3aedfe4-2bbb-46c9-97d4-8d6782c44707"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.824544 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6344c1fe-eecb-4d57-a5c7-a857e4466439-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6344c1fe-eecb-4d57-a5c7-a857e4466439" (UID: "6344c1fe-eecb-4d57-a5c7-a857e4466439"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.824697 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6344c1fe-eecb-4d57-a5c7-a857e4466439-kube-api-access-rmwsr" (OuterVolumeSpecName: "kube-api-access-rmwsr") pod "6344c1fe-eecb-4d57-a5c7-a857e4466439" (UID: "6344c1fe-eecb-4d57-a5c7-a857e4466439"). InnerVolumeSpecName "kube-api-access-rmwsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.835894 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8806487-486f-464d-8249-b6368daabff5-kube-api-access-qh4qm" (OuterVolumeSpecName: "kube-api-access-qh4qm") pod "e8806487-486f-464d-8249-b6368daabff5" (UID: "e8806487-486f-464d-8249-b6368daabff5"). InnerVolumeSpecName "kube-api-access-qh4qm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.848472 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3aedfe4-2bbb-46c9-97d4-8d6782c44707-kube-api-access-qdgbd" (OuterVolumeSpecName: "kube-api-access-qdgbd") pod "e3aedfe4-2bbb-46c9-97d4-8d6782c44707" (UID: "e3aedfe4-2bbb-46c9-97d4-8d6782c44707"). InnerVolumeSpecName "kube-api-access-qdgbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.917591 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8806487-486f-464d-8249-b6368daabff5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.917645 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6344c1fe-eecb-4d57-a5c7-a857e4466439-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.917659 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3aedfe4-2bbb-46c9-97d4-8d6782c44707-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.917669 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdgbd\" (UniqueName: \"kubernetes.io/projected/e3aedfe4-2bbb-46c9-97d4-8d6782c44707-kube-api-access-qdgbd\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.917680 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qh4qm\" (UniqueName: \"kubernetes.io/projected/e8806487-486f-464d-8249-b6368daabff5-kube-api-access-qh4qm\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:31 crc kubenswrapper[4781]: I0227 00:26:31.917690 4781 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-rmwsr\" (UniqueName: \"kubernetes.io/projected/6344c1fe-eecb-4d57-a5c7-a857e4466439-kube-api-access-rmwsr\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.672811 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rs9bx" event={"ID":"58b577a3-c234-4968-a8e7-c5e629de47b1","Type":"ContainerStarted","Data":"69da9fba4081d0816d2a2271ca344a6097bd067857fe6ffab787c65da0531cbc"} Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.680677 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-m5rm5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.681476 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-6e38-account-create-update-dntk2" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.685461 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11","Type":"ContainerStarted","Data":"f123a34509614ea32220d16f3dcaae5c63f248b87dbd7f293f38c1d213478e87"} Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.685603 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a8a8-account-create-update-vcwwx" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.685561 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-zvn4t" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.685506 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-a05d-account-create-update-cw8zv" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.702608 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-rs9bx" podStartSLOduration=6.885414322 podStartE2EDuration="12.702591104s" podCreationTimestamp="2026-02-27 00:26:20 +0000 UTC" firstStartedPulling="2026-02-27 00:26:25.32014684 +0000 UTC m=+1254.577686394" lastFinishedPulling="2026-02-27 00:26:31.137323622 +0000 UTC m=+1260.394863176" observedRunningTime="2026-02-27 00:26:32.690910385 +0000 UTC m=+1261.948449939" watchObservedRunningTime="2026-02-27 00:26:32.702591104 +0000 UTC m=+1261.960130658" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.773060 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9zkpb-config-9k74f"] Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.794372 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-9zkpb-config-9k74f"] Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.825116 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9zkpb-config-rxxl5"] Feb 27 00:26:32 crc kubenswrapper[4781]: E0227 00:26:32.825466 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6344c1fe-eecb-4d57-a5c7-a857e4466439" containerName="mariadb-account-create-update" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.825483 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="6344c1fe-eecb-4d57-a5c7-a857e4466439" containerName="mariadb-account-create-update" Feb 27 00:26:32 crc kubenswrapper[4781]: E0227 00:26:32.825499 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eb55288-e9bb-46f0-bae3-789e8db036cf" containerName="mariadb-account-create-update" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.825506 4781 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0eb55288-e9bb-46f0-bae3-789e8db036cf" containerName="mariadb-account-create-update" Feb 27 00:26:32 crc kubenswrapper[4781]: E0227 00:26:32.825517 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b9cc074-4ea1-4c04-9398-5be68fbcd5cf" containerName="mariadb-account-create-update" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.825524 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b9cc074-4ea1-4c04-9398-5be68fbcd5cf" containerName="mariadb-account-create-update" Feb 27 00:26:32 crc kubenswrapper[4781]: E0227 00:26:32.825534 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="119c35de-5e7a-4d3f-af8a-3595d7dc69aa" containerName="ovn-config" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.825539 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="119c35de-5e7a-4d3f-af8a-3595d7dc69aa" containerName="ovn-config" Feb 27 00:26:32 crc kubenswrapper[4781]: E0227 00:26:32.825549 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ae26ad0-3770-4153-a1d6-96ae3a9e36a9" containerName="mariadb-database-create" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.825555 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ae26ad0-3770-4153-a1d6-96ae3a9e36a9" containerName="mariadb-database-create" Feb 27 00:26:32 crc kubenswrapper[4781]: E0227 00:26:32.825569 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3aedfe4-2bbb-46c9-97d4-8d6782c44707" containerName="mariadb-database-create" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.825575 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3aedfe4-2bbb-46c9-97d4-8d6782c44707" containerName="mariadb-database-create" Feb 27 00:26:32 crc kubenswrapper[4781]: E0227 00:26:32.825587 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cafd294d-e929-4cd5-8be3-7175ad4aed09" containerName="mariadb-account-create-update" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 
00:26:32.825593 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="cafd294d-e929-4cd5-8be3-7175ad4aed09" containerName="mariadb-account-create-update" Feb 27 00:26:32 crc kubenswrapper[4781]: E0227 00:26:32.825603 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8806487-486f-464d-8249-b6368daabff5" containerName="mariadb-database-create" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.825609 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8806487-486f-464d-8249-b6368daabff5" containerName="mariadb-database-create" Feb 27 00:26:32 crc kubenswrapper[4781]: E0227 00:26:32.825620 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="388be198-b438-4142-8fb8-ec9831e9a1af" containerName="mariadb-account-create-update" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.827669 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="388be198-b438-4142-8fb8-ec9831e9a1af" containerName="mariadb-account-create-update" Feb 27 00:26:32 crc kubenswrapper[4781]: E0227 00:26:32.827683 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24adb929-f812-4243-94ea-23345856d28f" containerName="mariadb-database-create" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.827691 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="24adb929-f812-4243-94ea-23345856d28f" containerName="mariadb-database-create" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.827898 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="6344c1fe-eecb-4d57-a5c7-a857e4466439" containerName="mariadb-account-create-update" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.827914 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="cafd294d-e929-4cd5-8be3-7175ad4aed09" containerName="mariadb-account-create-update" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.827935 4781 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5b9cc074-4ea1-4c04-9398-5be68fbcd5cf" containerName="mariadb-account-create-update" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.827945 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="24adb929-f812-4243-94ea-23345856d28f" containerName="mariadb-database-create" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.827956 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8806487-486f-464d-8249-b6368daabff5" containerName="mariadb-database-create" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.827967 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3aedfe4-2bbb-46c9-97d4-8d6782c44707" containerName="mariadb-database-create" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.827976 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ae26ad0-3770-4153-a1d6-96ae3a9e36a9" containerName="mariadb-database-create" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.827992 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="0eb55288-e9bb-46f0-bae3-789e8db036cf" containerName="mariadb-account-create-update" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.828001 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="388be198-b438-4142-8fb8-ec9831e9a1af" containerName="mariadb-account-create-update" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.828011 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="119c35de-5e7a-4d3f-af8a-3595d7dc69aa" containerName="ovn-config" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.828596 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.831759 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.843086 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42267\" (UniqueName: \"kubernetes.io/projected/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-kube-api-access-42267\") pod \"ovn-controller-9zkpb-config-rxxl5\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.843144 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-scripts\") pod \"ovn-controller-9zkpb-config-rxxl5\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.843216 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-var-log-ovn\") pod \"ovn-controller-9zkpb-config-rxxl5\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.843232 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-additional-scripts\") pod \"ovn-controller-9zkpb-config-rxxl5\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.843256 4781 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-var-run\") pod \"ovn-controller-9zkpb-config-rxxl5\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.843307 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-var-run-ovn\") pod \"ovn-controller-9zkpb-config-rxxl5\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.871758 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9zkpb-config-rxxl5"] Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.944596 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-var-run-ovn\") pod \"ovn-controller-9zkpb-config-rxxl5\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.944893 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-var-run-ovn\") pod \"ovn-controller-9zkpb-config-rxxl5\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.948316 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42267\" (UniqueName: \"kubernetes.io/projected/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-kube-api-access-42267\") pod 
\"ovn-controller-9zkpb-config-rxxl5\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.948429 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-scripts\") pod \"ovn-controller-9zkpb-config-rxxl5\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.948660 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-additional-scripts\") pod \"ovn-controller-9zkpb-config-rxxl5\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.948741 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-var-log-ovn\") pod \"ovn-controller-9zkpb-config-rxxl5\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.948803 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-var-run\") pod \"ovn-controller-9zkpb-config-rxxl5\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.949079 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-var-run\") pod \"ovn-controller-9zkpb-config-rxxl5\" (UID: 
\"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.949188 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-var-log-ovn\") pod \"ovn-controller-9zkpb-config-rxxl5\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.949721 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-additional-scripts\") pod \"ovn-controller-9zkpb-config-rxxl5\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.952742 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-scripts\") pod \"ovn-controller-9zkpb-config-rxxl5\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:32 crc kubenswrapper[4781]: I0227 00:26:32.964322 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42267\" (UniqueName: \"kubernetes.io/projected/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-kube-api-access-42267\") pod \"ovn-controller-9zkpb-config-rxxl5\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:33 crc kubenswrapper[4781]: I0227 00:26:33.024766 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:33 crc kubenswrapper[4781]: I0227 00:26:33.328522 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="119c35de-5e7a-4d3f-af8a-3595d7dc69aa" path="/var/lib/kubelet/pods/119c35de-5e7a-4d3f-af8a-3595d7dc69aa/volumes" Feb 27 00:26:33 crc kubenswrapper[4781]: I0227 00:26:33.586700 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9zkpb-config-rxxl5"] Feb 27 00:26:33 crc kubenswrapper[4781]: W0227 00:26:33.593798 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e5d8eab_34ec_499a_9d69_068f5fb7d9ad.slice/crio-aa283a0b96a963d6521b640c88b47d54a5b742d6d8c749f00a556b9bd66c57bb WatchSource:0}: Error finding container aa283a0b96a963d6521b640c88b47d54a5b742d6d8c749f00a556b9bd66c57bb: Status 404 returned error can't find the container with id aa283a0b96a963d6521b640c88b47d54a5b742d6d8c749f00a556b9bd66c57bb Feb 27 00:26:33 crc kubenswrapper[4781]: I0227 00:26:33.691920 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11","Type":"ContainerStarted","Data":"67c3b0c45a47fddfb6a0951ff71913409afedbf8d25ee2bb5323d4bbb8b32af1"} Feb 27 00:26:33 crc kubenswrapper[4781]: I0227 00:26:33.691964 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11","Type":"ContainerStarted","Data":"d12ec22021b20d4ca12664a4251617c4b740ddc1110cc6c583285cc7c7efa3da"} Feb 27 00:26:33 crc kubenswrapper[4781]: I0227 00:26:33.696367 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9zkpb-config-rxxl5" event={"ID":"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad","Type":"ContainerStarted","Data":"aa283a0b96a963d6521b640c88b47d54a5b742d6d8c749f00a556b9bd66c57bb"} Feb 27 00:26:34 crc kubenswrapper[4781]: 
I0227 00:26:34.713918 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11","Type":"ContainerStarted","Data":"df80b1a2f30812e281285d79ab8c9e2883ed993d5dfaf62c28bd45bbabd723ce"} Feb 27 00:26:34 crc kubenswrapper[4781]: I0227 00:26:34.714537 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11","Type":"ContainerStarted","Data":"325759443b43a962464bf46dcc5ca5259566a08633a98d0057d55db65af383d2"} Feb 27 00:26:34 crc kubenswrapper[4781]: I0227 00:26:34.718105 4781 generic.go:334] "Generic (PLEG): container finished" podID="47cc3f01-6a5c-4797-bf86-25770e66e928" containerID="e75379ab5c604b926c8da8b4e1bc70d938265b4b81cac412dc92c66988d11e4a" exitCode=0 Feb 27 00:26:34 crc kubenswrapper[4781]: I0227 00:26:34.718205 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8tmft" event={"ID":"47cc3f01-6a5c-4797-bf86-25770e66e928","Type":"ContainerDied","Data":"e75379ab5c604b926c8da8b4e1bc70d938265b4b81cac412dc92c66988d11e4a"} Feb 27 00:26:34 crc kubenswrapper[4781]: I0227 00:26:34.721274 4781 generic.go:334] "Generic (PLEG): container finished" podID="4e5d8eab-34ec-499a-9d69-068f5fb7d9ad" containerID="da1dbeb22d52f0e9e8028b046b421ef782d44fa0719cff0b4421d346eb2fd5aa" exitCode=0 Feb 27 00:26:34 crc kubenswrapper[4781]: I0227 00:26:34.721315 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9zkpb-config-rxxl5" event={"ID":"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad","Type":"ContainerDied","Data":"da1dbeb22d52f0e9e8028b046b421ef782d44fa0719cff0b4421d346eb2fd5aa"} Feb 27 00:26:35 crc kubenswrapper[4781]: I0227 00:26:35.735597 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11","Type":"ContainerStarted","Data":"de3fe0551c7c98ccf5400a775dd2816f8af862a74d043466dc97b6d801196637"} 
Feb 27 00:26:35 crc kubenswrapper[4781]: I0227 00:26:35.735894 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11","Type":"ContainerStarted","Data":"5298b3fdc2f8074231037b5c09ecc03350f0efc960213efc6b6884e137f198cf"} Feb 27 00:26:35 crc kubenswrapper[4781]: I0227 00:26:35.742283 4781 generic.go:334] "Generic (PLEG): container finished" podID="58b577a3-c234-4968-a8e7-c5e629de47b1" containerID="69da9fba4081d0816d2a2271ca344a6097bd067857fe6ffab787c65da0531cbc" exitCode=0 Feb 27 00:26:35 crc kubenswrapper[4781]: I0227 00:26:35.742556 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rs9bx" event={"ID":"58b577a3-c234-4968-a8e7-c5e629de47b1","Type":"ContainerDied","Data":"69da9fba4081d0816d2a2271ca344a6097bd067857fe6ffab787c65da0531cbc"} Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.055166 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.095867 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.099058 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.241318 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-var-run-ovn\") pod \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.241392 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42267\" (UniqueName: 
\"kubernetes.io/projected/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-kube-api-access-42267\") pod \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.241425 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-scripts\") pod \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.241503 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-additional-scripts\") pod \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.241557 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-var-run\") pod \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.241614 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-var-log-ovn\") pod \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\" (UID: \"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad\") " Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.242408 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "4e5d8eab-34ec-499a-9d69-068f5fb7d9ad" (UID: "4e5d8eab-34ec-499a-9d69-068f5fb7d9ad"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.242922 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "4e5d8eab-34ec-499a-9d69-068f5fb7d9ad" (UID: "4e5d8eab-34ec-499a-9d69-068f5fb7d9ad"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.243111 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-var-run" (OuterVolumeSpecName: "var-run") pod "4e5d8eab-34ec-499a-9d69-068f5fb7d9ad" (UID: "4e5d8eab-34ec-499a-9d69-068f5fb7d9ad"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.243152 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "4e5d8eab-34ec-499a-9d69-068f5fb7d9ad" (UID: "4e5d8eab-34ec-499a-9d69-068f5fb7d9ad"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.243975 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-scripts" (OuterVolumeSpecName: "scripts") pod "4e5d8eab-34ec-499a-9d69-068f5fb7d9ad" (UID: "4e5d8eab-34ec-499a-9d69-068f5fb7d9ad"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.247814 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-kube-api-access-42267" (OuterVolumeSpecName: "kube-api-access-42267") pod "4e5d8eab-34ec-499a-9d69-068f5fb7d9ad" (UID: "4e5d8eab-34ec-499a-9d69-068f5fb7d9ad"). InnerVolumeSpecName "kube-api-access-42267". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.343872 4781 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.344167 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42267\" (UniqueName: \"kubernetes.io/projected/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-kube-api-access-42267\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.344185 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.344197 4781 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.344234 4781 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-var-run\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.344244 4781 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.356323 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-8tmft" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.548328 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsrkc\" (UniqueName: \"kubernetes.io/projected/47cc3f01-6a5c-4797-bf86-25770e66e928-kube-api-access-gsrkc\") pod \"47cc3f01-6a5c-4797-bf86-25770e66e928\" (UID: \"47cc3f01-6a5c-4797-bf86-25770e66e928\") " Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.548468 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47cc3f01-6a5c-4797-bf86-25770e66e928-config-data\") pod \"47cc3f01-6a5c-4797-bf86-25770e66e928\" (UID: \"47cc3f01-6a5c-4797-bf86-25770e66e928\") " Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.548645 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/47cc3f01-6a5c-4797-bf86-25770e66e928-db-sync-config-data\") pod \"47cc3f01-6a5c-4797-bf86-25770e66e928\" (UID: \"47cc3f01-6a5c-4797-bf86-25770e66e928\") " Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.548672 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47cc3f01-6a5c-4797-bf86-25770e66e928-combined-ca-bundle\") pod \"47cc3f01-6a5c-4797-bf86-25770e66e928\" (UID: \"47cc3f01-6a5c-4797-bf86-25770e66e928\") " Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.558825 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47cc3f01-6a5c-4797-bf86-25770e66e928-kube-api-access-gsrkc" (OuterVolumeSpecName: "kube-api-access-gsrkc") 
pod "47cc3f01-6a5c-4797-bf86-25770e66e928" (UID: "47cc3f01-6a5c-4797-bf86-25770e66e928"). InnerVolumeSpecName "kube-api-access-gsrkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.559460 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47cc3f01-6a5c-4797-bf86-25770e66e928-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "47cc3f01-6a5c-4797-bf86-25770e66e928" (UID: "47cc3f01-6a5c-4797-bf86-25770e66e928"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.579292 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47cc3f01-6a5c-4797-bf86-25770e66e928-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47cc3f01-6a5c-4797-bf86-25770e66e928" (UID: "47cc3f01-6a5c-4797-bf86-25770e66e928"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.609115 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47cc3f01-6a5c-4797-bf86-25770e66e928-config-data" (OuterVolumeSpecName: "config-data") pod "47cc3f01-6a5c-4797-bf86-25770e66e928" (UID: "47cc3f01-6a5c-4797-bf86-25770e66e928"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.650859 4781 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/47cc3f01-6a5c-4797-bf86-25770e66e928-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.650976 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47cc3f01-6a5c-4797-bf86-25770e66e928-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.651063 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsrkc\" (UniqueName: \"kubernetes.io/projected/47cc3f01-6a5c-4797-bf86-25770e66e928-kube-api-access-gsrkc\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.651146 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47cc3f01-6a5c-4797-bf86-25770e66e928-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.757660 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9zkpb-config-rxxl5" event={"ID":"4e5d8eab-34ec-499a-9d69-068f5fb7d9ad","Type":"ContainerDied","Data":"aa283a0b96a963d6521b640c88b47d54a5b742d6d8c749f00a556b9bd66c57bb"} Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.757689 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9zkpb-config-rxxl5" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.757723 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa283a0b96a963d6521b640c88b47d54a5b742d6d8c749f00a556b9bd66c57bb" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.764495 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11","Type":"ContainerStarted","Data":"8f47ac743ea10fdb7ecd5ee7aee1a8be95e4cb690f71f8b316e36580af9decfd"} Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.764532 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11","Type":"ContainerStarted","Data":"41227c8db766ee5b37d3103966e5e64ae2d6cc15dd0bd2daecd0a0014c90a62d"} Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.764546 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11","Type":"ContainerStarted","Data":"6b44f38c9eaff116f1e2aeb9c56b1882e9c280eb2ad1d24d93d2a2bcf46057d2"} Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.764559 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11","Type":"ContainerStarted","Data":"281ce6127d50e533b6ef2c64eeb59c860e5d40b0cbb82b607c6f5880f0452db7"} Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.767694 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-8tmft" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.771453 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8tmft" event={"ID":"47cc3f01-6a5c-4797-bf86-25770e66e928","Type":"ContainerDied","Data":"c43bdd484887a1ab19b1a74ff7e94493b840e9d2b41b9b9e8c3466f0b78cc88d"} Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.771770 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c43bdd484887a1ab19b1a74ff7e94493b840e9d2b41b9b9e8c3466f0b78cc88d" Feb 27 00:26:36 crc kubenswrapper[4781]: I0227 00:26:36.772733 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.041380 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.260339 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9zkpb-config-rxxl5"] Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.284482 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-9zkpb-config-rxxl5"] Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.331124 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e5d8eab-34ec-499a-9d69-068f5fb7d9ad" path="/var/lib/kubelet/pods/4e5d8eab-34ec-499a-9d69-068f5fb7d9ad/volumes" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.368330 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-jzk92"] Feb 27 00:26:37 crc kubenswrapper[4781]: E0227 00:26:37.369063 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e5d8eab-34ec-499a-9d69-068f5fb7d9ad" containerName="ovn-config" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.369077 4781 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="4e5d8eab-34ec-499a-9d69-068f5fb7d9ad" containerName="ovn-config" Feb 27 00:26:37 crc kubenswrapper[4781]: E0227 00:26:37.369117 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47cc3f01-6a5c-4797-bf86-25770e66e928" containerName="glance-db-sync" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.369124 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="47cc3f01-6a5c-4797-bf86-25770e66e928" containerName="glance-db-sync" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.369504 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e5d8eab-34ec-499a-9d69-068f5fb7d9ad" containerName="ovn-config" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.369528 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="47cc3f01-6a5c-4797-bf86-25770e66e928" containerName="glance-db-sync" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.373416 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.400010 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-jzk92"] Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.411663 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-rs9bx" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.488992 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58b577a3-c234-4968-a8e7-c5e629de47b1-combined-ca-bundle\") pod \"58b577a3-c234-4968-a8e7-c5e629de47b1\" (UID: \"58b577a3-c234-4968-a8e7-c5e629de47b1\") " Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.489430 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29n8z\" (UniqueName: \"kubernetes.io/projected/58b577a3-c234-4968-a8e7-c5e629de47b1-kube-api-access-29n8z\") pod \"58b577a3-c234-4968-a8e7-c5e629de47b1\" (UID: \"58b577a3-c234-4968-a8e7-c5e629de47b1\") " Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.489568 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58b577a3-c234-4968-a8e7-c5e629de47b1-config-data\") pod \"58b577a3-c234-4968-a8e7-c5e629de47b1\" (UID: \"58b577a3-c234-4968-a8e7-c5e629de47b1\") " Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.490039 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-config\") pod \"dnsmasq-dns-5b946c75cc-jzk92\" (UID: \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.490110 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-jzk92\" (UID: \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.490139 
4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-jzk92\" (UID: \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.490244 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n688\" (UniqueName: \"kubernetes.io/projected/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-kube-api-access-9n688\") pod \"dnsmasq-dns-5b946c75cc-jzk92\" (UID: \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.490320 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-jzk92\" (UID: \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.499118 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58b577a3-c234-4968-a8e7-c5e629de47b1-kube-api-access-29n8z" (OuterVolumeSpecName: "kube-api-access-29n8z") pod "58b577a3-c234-4968-a8e7-c5e629de47b1" (UID: "58b577a3-c234-4968-a8e7-c5e629de47b1"). InnerVolumeSpecName "kube-api-access-29n8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.529502 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58b577a3-c234-4968-a8e7-c5e629de47b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58b577a3-c234-4968-a8e7-c5e629de47b1" (UID: "58b577a3-c234-4968-a8e7-c5e629de47b1"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.569544 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58b577a3-c234-4968-a8e7-c5e629de47b1-config-data" (OuterVolumeSpecName: "config-data") pod "58b577a3-c234-4968-a8e7-c5e629de47b1" (UID: "58b577a3-c234-4968-a8e7-c5e629de47b1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.592064 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-jzk92\" (UID: \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.592109 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-jzk92\" (UID: \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.592192 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n688\" (UniqueName: \"kubernetes.io/projected/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-kube-api-access-9n688\") pod \"dnsmasq-dns-5b946c75cc-jzk92\" (UID: \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.592246 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-jzk92\" (UID: 
\"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.592274 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-config\") pod \"dnsmasq-dns-5b946c75cc-jzk92\" (UID: \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.592526 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58b577a3-c234-4968-a8e7-c5e629de47b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.592538 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29n8z\" (UniqueName: \"kubernetes.io/projected/58b577a3-c234-4968-a8e7-c5e629de47b1-kube-api-access-29n8z\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.592549 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58b577a3-c234-4968-a8e7-c5e629de47b1-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.593261 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-config\") pod \"dnsmasq-dns-5b946c75cc-jzk92\" (UID: \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.593800 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-jzk92\" (UID: \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " 
pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.595858 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-jzk92\" (UID: \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.596048 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-jzk92\" (UID: \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.619818 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n688\" (UniqueName: \"kubernetes.io/projected/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-kube-api-access-9n688\") pod \"dnsmasq-dns-5b946c75cc-jzk92\" (UID: \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.718108 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.801017 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rs9bx" event={"ID":"58b577a3-c234-4968-a8e7-c5e629de47b1","Type":"ContainerDied","Data":"9adcda9f33abe1db2735efdad7642538c67bab67f10201630c502ea9fc7b9c52"} Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.801046 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-rs9bx" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.801062 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9adcda9f33abe1db2735efdad7642538c67bab67f10201630c502ea9fc7b9c52" Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.812291 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11","Type":"ContainerStarted","Data":"7da95ec092aa8d03f82818fa419cc458bcf6c5915f99840320539283be886091"} Feb 27 00:26:37 crc kubenswrapper[4781]: I0227 00:26:37.862745 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=43.22446746 podStartE2EDuration="52.862726298s" podCreationTimestamp="2026-02-27 00:25:45 +0000 UTC" firstStartedPulling="2026-02-27 00:26:25.48590174 +0000 UTC m=+1254.743441304" lastFinishedPulling="2026-02-27 00:26:35.124160588 +0000 UTC m=+1264.381700142" observedRunningTime="2026-02-27 00:26:37.847752442 +0000 UTC m=+1267.105292016" watchObservedRunningTime="2026-02-27 00:26:37.862726298 +0000 UTC m=+1267.120265852" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.042900 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-jzk92"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.069595 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-fwppv"] Feb 27 00:26:38 crc kubenswrapper[4781]: E0227 00:26:38.069989 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58b577a3-c234-4968-a8e7-c5e629de47b1" containerName="keystone-db-sync" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.070007 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="58b577a3-c234-4968-a8e7-c5e629de47b1" containerName="keystone-db-sync" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.070220 4781 
memory_manager.go:354] "RemoveStaleState removing state" podUID="58b577a3-c234-4968-a8e7-c5e629de47b1" containerName="keystone-db-sync" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.070876 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.078488 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4nhgp" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.078690 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.079061 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.079191 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.079762 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.098682 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fwppv"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.104742 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-scripts\") pod \"keystone-bootstrap-fwppv\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.104853 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-config-data\") pod \"keystone-bootstrap-fwppv\" (UID: 
\"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.104909 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdcl6\" (UniqueName: \"kubernetes.io/projected/75b432e5-2a1d-421d-ac63-202bbe4be5c5-kube-api-access-pdcl6\") pod \"keystone-bootstrap-fwppv\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.104963 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-credential-keys\") pod \"keystone-bootstrap-fwppv\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.104991 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-combined-ca-bundle\") pod \"keystone-bootstrap-fwppv\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.105042 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-fernet-keys\") pod \"keystone-bootstrap-fwppv\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.114681 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-784f69c749-m2gmj"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.116642 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.145249 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-784f69c749-m2gmj"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.210020 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-fernet-keys\") pod \"keystone-bootstrap-fwppv\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.210105 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-scripts\") pod \"keystone-bootstrap-fwppv\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.210147 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrnxf\" (UniqueName: \"kubernetes.io/projected/8be86e27-4a35-4929-92d1-bfcd0ce641a8-kube-api-access-nrnxf\") pod \"dnsmasq-dns-784f69c749-m2gmj\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.210183 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-config\") pod \"dnsmasq-dns-784f69c749-m2gmj\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.210222 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-config-data\") pod \"keystone-bootstrap-fwppv\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.211346 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-ovsdbserver-nb\") pod \"dnsmasq-dns-784f69c749-m2gmj\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.211433 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-ovsdbserver-sb\") pod \"dnsmasq-dns-784f69c749-m2gmj\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.211514 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdcl6\" (UniqueName: \"kubernetes.io/projected/75b432e5-2a1d-421d-ac63-202bbe4be5c5-kube-api-access-pdcl6\") pod \"keystone-bootstrap-fwppv\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.211650 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-credential-keys\") pod \"keystone-bootstrap-fwppv\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.211740 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-combined-ca-bundle\") pod \"keystone-bootstrap-fwppv\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.211822 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-dns-svc\") pod \"dnsmasq-dns-784f69c749-m2gmj\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.229441 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-credential-keys\") pod \"keystone-bootstrap-fwppv\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.231141 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-combined-ca-bundle\") pod \"keystone-bootstrap-fwppv\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.246146 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-config-data\") pod \"keystone-bootstrap-fwppv\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.247065 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-scripts\") pod \"keystone-bootstrap-fwppv\" (UID: 
\"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.249978 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-fernet-keys\") pod \"keystone-bootstrap-fwppv\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.260954 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdcl6\" (UniqueName: \"kubernetes.io/projected/75b432e5-2a1d-421d-ac63-202bbe4be5c5-kube-api-access-pdcl6\") pod \"keystone-bootstrap-fwppv\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.314655 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-dns-svc\") pod \"dnsmasq-dns-784f69c749-m2gmj\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.314763 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrnxf\" (UniqueName: \"kubernetes.io/projected/8be86e27-4a35-4929-92d1-bfcd0ce641a8-kube-api-access-nrnxf\") pod \"dnsmasq-dns-784f69c749-m2gmj\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.314801 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-config\") pod \"dnsmasq-dns-784f69c749-m2gmj\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 
00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.314848 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-ovsdbserver-nb\") pod \"dnsmasq-dns-784f69c749-m2gmj\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.314868 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-ovsdbserver-sb\") pod \"dnsmasq-dns-784f69c749-m2gmj\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.315747 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-ovsdbserver-sb\") pod \"dnsmasq-dns-784f69c749-m2gmj\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.316272 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-dns-svc\") pod \"dnsmasq-dns-784f69c749-m2gmj\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.317026 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-config\") pod \"dnsmasq-dns-784f69c749-m2gmj\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.317534 4781 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-ovsdbserver-nb\") pod \"dnsmasq-dns-784f69c749-m2gmj\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.361187 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrnxf\" (UniqueName: \"kubernetes.io/projected/8be86e27-4a35-4929-92d1-bfcd0ce641a8-kube-api-access-nrnxf\") pod \"dnsmasq-dns-784f69c749-m2gmj\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.389409 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.433938 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-jzk92"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.462767 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-9vlp4"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.483479 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.492505 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-784f69c749-m2gmj"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.492922 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.497530 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-5hsdr" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.497751 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.497863 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.517930 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-combined-ca-bundle\") pod \"cinder-db-sync-9vlp4\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.517991 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-scripts\") pod \"cinder-db-sync-9vlp4\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.518019 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvkqc\" (UniqueName: \"kubernetes.io/projected/aef65495-ecb2-4396-bb05-a4c5ee48f291-kube-api-access-tvkqc\") pod \"cinder-db-sync-9vlp4\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.518067 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-db-sync-config-data\") pod \"cinder-db-sync-9vlp4\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.518089 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aef65495-ecb2-4396-bb05-a4c5ee48f291-etc-machine-id\") pod \"cinder-db-sync-9vlp4\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.518139 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-config-data\") pod \"cinder-db-sync-9vlp4\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.522506 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-9vlp4"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.595647 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-82c69"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.597232 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.602890 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.622541 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-config\") pod \"dnsmasq-dns-847c4cc679-82c69\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.622603 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-db-sync-config-data\") pod \"cinder-db-sync-9vlp4\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.622638 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aef65495-ecb2-4396-bb05-a4c5ee48f291-etc-machine-id\") pod \"cinder-db-sync-9vlp4\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.622657 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-82c69\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.622715 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-config-data\") pod \"cinder-db-sync-9vlp4\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.622756 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-82c69\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.622776 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bst95\" (UniqueName: \"kubernetes.io/projected/9673a51c-390f-4e38-ae85-e5c3e1eaa816-kube-api-access-bst95\") pod \"dnsmasq-dns-847c4cc679-82c69\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.622809 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-combined-ca-bundle\") pod \"cinder-db-sync-9vlp4\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.622832 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-dns-svc\") pod \"dnsmasq-dns-847c4cc679-82c69\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.622857 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-scripts\") pod \"cinder-db-sync-9vlp4\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.622875 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvkqc\" (UniqueName: \"kubernetes.io/projected/aef65495-ecb2-4396-bb05-a4c5ee48f291-kube-api-access-tvkqc\") pod \"cinder-db-sync-9vlp4\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.622913 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-82c69\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.623213 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aef65495-ecb2-4396-bb05-a4c5ee48f291-etc-machine-id\") pod \"cinder-db-sync-9vlp4\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.632336 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-82c69"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.637852 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-bk54r"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.645310 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-combined-ca-bundle\") pod \"cinder-db-sync-9vlp4\" (UID: 
\"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.645686 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-db-sync-config-data\") pod \"cinder-db-sync-9vlp4\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.663409 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-bf4zw"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.664159 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-bk54r"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.664236 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-bf4zw" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.664687 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-bk54r" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.671516 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.671882 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.672099 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.672185 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-d4ppr" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.672264 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-2j295" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.676146 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-config-data\") pod \"cinder-db-sync-9vlp4\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.678250 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvkqc\" (UniqueName: \"kubernetes.io/projected/aef65495-ecb2-4396-bb05-a4c5ee48f291-kube-api-access-tvkqc\") pod \"cinder-db-sync-9vlp4\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.678912 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-bf4zw"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.683833 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-scripts\") pod \"cinder-db-sync-9vlp4\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.730238 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-82c69\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.730348 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bst95\" (UniqueName: \"kubernetes.io/projected/9673a51c-390f-4e38-ae85-e5c3e1eaa816-kube-api-access-bst95\") pod \"dnsmasq-dns-847c4cc679-82c69\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.730436 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-dns-svc\") pod \"dnsmasq-dns-847c4cc679-82c69\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.730540 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-82c69\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.730585 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-config\") pod 
\"dnsmasq-dns-847c4cc679-82c69\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.730651 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-82c69\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.731875 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-dns-svc\") pod \"dnsmasq-dns-847c4cc679-82c69\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.732859 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-82c69\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.733500 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-config\") pod \"dnsmasq-dns-847c4cc679-82c69\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.766532 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-82c69\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " 
pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.766662 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-82c69\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.786655 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.794861 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.817988 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.818420 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.819293 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bst95\" (UniqueName: \"kubernetes.io/projected/9673a51c-390f-4e38-ae85-e5c3e1eaa816-kube-api-access-bst95\") pod \"dnsmasq-dns-847c4cc679-82c69\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.833485 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314ca901-3264-4136-b377-daad0075b72c-combined-ca-bundle\") pod \"barbican-db-sync-bf4zw\" (UID: \"314ca901-3264-4136-b377-daad0075b72c\") " pod="openstack/barbican-db-sync-bf4zw" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.833561 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzkc9\" (UniqueName: \"kubernetes.io/projected/3f43ab5c-f862-468c-92c1-ec7366eb7ed0-kube-api-access-lzkc9\") pod \"neutron-db-sync-bk54r\" (UID: \"3f43ab5c-f862-468c-92c1-ec7366eb7ed0\") " pod="openstack/neutron-db-sync-bk54r" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.840236 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3f43ab5c-f862-468c-92c1-ec7366eb7ed0-config\") pod \"neutron-db-sync-bk54r\" (UID: \"3f43ab5c-f862-468c-92c1-ec7366eb7ed0\") " pod="openstack/neutron-db-sync-bk54r" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.842702 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/314ca901-3264-4136-b377-daad0075b72c-db-sync-config-data\") pod \"barbican-db-sync-bf4zw\" (UID: \"314ca901-3264-4136-b377-daad0075b72c\") " pod="openstack/barbican-db-sync-bf4zw" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.851225 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f43ab5c-f862-468c-92c1-ec7366eb7ed0-combined-ca-bundle\") pod \"neutron-db-sync-bk54r\" (UID: \"3f43ab5c-f862-468c-92c1-ec7366eb7ed0\") " pod="openstack/neutron-db-sync-bk54r" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.851322 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmp94\" (UniqueName: \"kubernetes.io/projected/314ca901-3264-4136-b377-daad0075b72c-kube-api-access-lmp94\") pod \"barbican-db-sync-bf4zw\" (UID: \"314ca901-3264-4136-b377-daad0075b72c\") " pod="openstack/barbican-db-sync-bf4zw" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.853286 4781 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-82c69"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.855273 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.857125 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.860524 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" event={"ID":"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0","Type":"ContainerStarted","Data":"8ff5a2fc17a6b9e8aabae8319eed0080a626c32c0d363e4d215380fca5df2f7c"} Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.919075 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.946675 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-5d6jk"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.948723 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.957285 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-config-data\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.957358 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-log-httpd\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.957414 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/314ca901-3264-4136-b377-daad0075b72c-db-sync-config-data\") pod \"barbican-db-sync-bf4zw\" (UID: \"314ca901-3264-4136-b377-daad0075b72c\") " pod="openstack/barbican-db-sync-bf4zw" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.957474 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-run-httpd\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.957535 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f43ab5c-f862-468c-92c1-ec7366eb7ed0-combined-ca-bundle\") pod \"neutron-db-sync-bk54r\" (UID: \"3f43ab5c-f862-468c-92c1-ec7366eb7ed0\") " pod="openstack/neutron-db-sync-bk54r" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.957566 
4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.957647 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmp94\" (UniqueName: \"kubernetes.io/projected/314ca901-3264-4136-b377-daad0075b72c-kube-api-access-lmp94\") pod \"barbican-db-sync-bf4zw\" (UID: \"314ca901-3264-4136-b377-daad0075b72c\") " pod="openstack/barbican-db-sync-bf4zw" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.957734 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314ca901-3264-4136-b377-daad0075b72c-combined-ca-bundle\") pod \"barbican-db-sync-bf4zw\" (UID: \"314ca901-3264-4136-b377-daad0075b72c\") " pod="openstack/barbican-db-sync-bf4zw" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.957807 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-scripts\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.957841 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzkc9\" (UniqueName: \"kubernetes.io/projected/3f43ab5c-f862-468c-92c1-ec7366eb7ed0-kube-api-access-lzkc9\") pod \"neutron-db-sync-bk54r\" (UID: \"3f43ab5c-f862-468c-92c1-ec7366eb7ed0\") " pod="openstack/neutron-db-sync-bk54r" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.957926 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-9qwfg\" (UniqueName: \"kubernetes.io/projected/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-kube-api-access-9qwfg\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.957985 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.958027 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3f43ab5c-f862-468c-92c1-ec7366eb7ed0-config\") pod \"neutron-db-sync-bk54r\" (UID: \"3f43ab5c-f862-468c-92c1-ec7366eb7ed0\") " pod="openstack/neutron-db-sync-bk54r" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.968195 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/314ca901-3264-4136-b377-daad0075b72c-db-sync-config-data\") pod \"barbican-db-sync-bf4zw\" (UID: \"314ca901-3264-4136-b377-daad0075b72c\") " pod="openstack/barbican-db-sync-bf4zw" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.971267 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-sync-l9w6z"] Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.971685 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314ca901-3264-4136-b377-daad0075b72c-combined-ca-bundle\") pod \"barbican-db-sync-bf4zw\" (UID: \"314ca901-3264-4136-b377-daad0075b72c\") " pod="openstack/barbican-db-sync-bf4zw" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.987293 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/secret/3f43ab5c-f862-468c-92c1-ec7366eb7ed0-config\") pod \"neutron-db-sync-bk54r\" (UID: \"3f43ab5c-f862-468c-92c1-ec7366eb7ed0\") " pod="openstack/neutron-db-sync-bk54r" Feb 27 00:26:38 crc kubenswrapper[4781]: I0227 00:26:38.987481 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f43ab5c-f862-468c-92c1-ec7366eb7ed0-combined-ca-bundle\") pod \"neutron-db-sync-bk54r\" (UID: \"3f43ab5c-f862-468c-92c1-ec7366eb7ed0\") " pod="openstack/neutron-db-sync-bk54r" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.006153 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzkc9\" (UniqueName: \"kubernetes.io/projected/3f43ab5c-f862-468c-92c1-ec7366eb7ed0-kube-api-access-lzkc9\") pod \"neutron-db-sync-bk54r\" (UID: \"3f43ab5c-f862-468c-92c1-ec7366eb7ed0\") " pod="openstack/neutron-db-sync-bk54r" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.007329 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-l9w6z" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.010311 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-qt68h" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.011158 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.011502 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.011568 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.018272 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmp94\" (UniqueName: \"kubernetes.io/projected/314ca901-3264-4136-b377-daad0075b72c-kube-api-access-lmp94\") pod \"barbican-db-sync-bf4zw\" (UID: \"314ca901-3264-4136-b377-daad0075b72c\") " pod="openstack/barbican-db-sync-bf4zw" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.018348 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-l9w6z"] Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.036366 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-jqsnp"] Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.037670 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jqsnp" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.042806 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-bf4zw" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.044838 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-7kxfw" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.045072 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.045324 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.053190 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-bk54r" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.059642 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qwfg\" (UniqueName: \"kubernetes.io/projected/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-kube-api-access-9qwfg\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.059692 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.059744 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-5d6jk\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.059776 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-config-data\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.059799 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-log-httpd\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.059833 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxl4f\" (UniqueName: \"kubernetes.io/projected/555d083f-48ec-4cf2-922f-211c99af51be-kube-api-access-hxl4f\") pod \"dnsmasq-dns-785d8bcb8c-5d6jk\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.059863 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-run-httpd\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.059882 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-config\") pod \"dnsmasq-dns-785d8bcb8c-5d6jk\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.059906 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-5d6jk\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.059925 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.059948 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-5d6jk\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.060034 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-5d6jk\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.060058 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-scripts\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.068815 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-5d6jk"] Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.069311 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-run-httpd\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.069553 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-log-httpd\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.095282 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qwfg\" (UniqueName: \"kubernetes.io/projected/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-kube-api-access-9qwfg\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.100930 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-scripts\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.105162 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-config-data\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.107399 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jqsnp"] Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.107967 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.115289 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " pod="openstack/ceilometer-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.161939 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3fa4251-dd48-417b-8002-6df02d3d3dac-logs\") pod \"placement-db-sync-jqsnp\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") " pod="openstack/placement-db-sync-jqsnp" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.162272 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-5d6jk\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.162375 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2274af64-0743-4ede-8fb8-e2ed801638ac-config-data\") pod \"cloudkitty-db-sync-l9w6z\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") " pod="openstack/cloudkitty-db-sync-l9w6z" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.162479 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxl4f\" (UniqueName: \"kubernetes.io/projected/555d083f-48ec-4cf2-922f-211c99af51be-kube-api-access-hxl4f\") pod 
\"dnsmasq-dns-785d8bcb8c-5d6jk\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.162562 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3fa4251-dd48-417b-8002-6df02d3d3dac-config-data\") pod \"placement-db-sync-jqsnp\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") " pod="openstack/placement-db-sync-jqsnp" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.162669 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-config\") pod \"dnsmasq-dns-785d8bcb8c-5d6jk\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.162770 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-5d6jk\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.162863 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3fa4251-dd48-417b-8002-6df02d3d3dac-combined-ca-bundle\") pod \"placement-db-sync-jqsnp\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") " pod="openstack/placement-db-sync-jqsnp" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.162936 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-5d6jk\" (UID: 
\"555d083f-48ec-4cf2-922f-211c99af51be\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.163010 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/2274af64-0743-4ede-8fb8-e2ed801638ac-certs\") pod \"cloudkitty-db-sync-l9w6z\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") " pod="openstack/cloudkitty-db-sync-l9w6z" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.163093 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwdbp\" (UniqueName: \"kubernetes.io/projected/a3fa4251-dd48-417b-8002-6df02d3d3dac-kube-api-access-pwdbp\") pod \"placement-db-sync-jqsnp\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") " pod="openstack/placement-db-sync-jqsnp" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.165109 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3fa4251-dd48-417b-8002-6df02d3d3dac-scripts\") pod \"placement-db-sync-jqsnp\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") " pod="openstack/placement-db-sync-jqsnp" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.165204 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2274af64-0743-4ede-8fb8-e2ed801638ac-scripts\") pod \"cloudkitty-db-sync-l9w6z\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") " pod="openstack/cloudkitty-db-sync-l9w6z" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.165303 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2274af64-0743-4ede-8fb8-e2ed801638ac-combined-ca-bundle\") pod \"cloudkitty-db-sync-l9w6z\" (UID: 
\"2274af64-0743-4ede-8fb8-e2ed801638ac\") " pod="openstack/cloudkitty-db-sync-l9w6z" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.165393 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwsv7\" (UniqueName: \"kubernetes.io/projected/2274af64-0743-4ede-8fb8-e2ed801638ac-kube-api-access-bwsv7\") pod \"cloudkitty-db-sync-l9w6z\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") " pod="openstack/cloudkitty-db-sync-l9w6z" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.165460 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-5d6jk\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.170266 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.179715 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-5d6jk\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.184354 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-config\") pod \"dnsmasq-dns-785d8bcb8c-5d6jk\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.186274 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-5d6jk\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.186977 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-5d6jk\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.187609 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-5d6jk\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 
00:26:39.199209 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxl4f\" (UniqueName: \"kubernetes.io/projected/555d083f-48ec-4cf2-922f-211c99af51be-kube-api-access-hxl4f\") pod \"dnsmasq-dns-785d8bcb8c-5d6jk\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.224434 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.243477 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.245305 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.249293 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.257115 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.257297 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-4ql2s" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.257412 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.271099 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3fa4251-dd48-417b-8002-6df02d3d3dac-logs\") pod \"placement-db-sync-jqsnp\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") " pod="openstack/placement-db-sync-jqsnp" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.271175 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2274af64-0743-4ede-8fb8-e2ed801638ac-config-data\") pod \"cloudkitty-db-sync-l9w6z\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") " pod="openstack/cloudkitty-db-sync-l9w6z" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.271235 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3fa4251-dd48-417b-8002-6df02d3d3dac-config-data\") pod \"placement-db-sync-jqsnp\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") " pod="openstack/placement-db-sync-jqsnp" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.271293 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3fa4251-dd48-417b-8002-6df02d3d3dac-combined-ca-bundle\") pod \"placement-db-sync-jqsnp\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") " pod="openstack/placement-db-sync-jqsnp" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.271323 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/2274af64-0743-4ede-8fb8-e2ed801638ac-certs\") pod \"cloudkitty-db-sync-l9w6z\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") " pod="openstack/cloudkitty-db-sync-l9w6z" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.271376 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwdbp\" (UniqueName: \"kubernetes.io/projected/a3fa4251-dd48-417b-8002-6df02d3d3dac-kube-api-access-pwdbp\") pod \"placement-db-sync-jqsnp\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") " pod="openstack/placement-db-sync-jqsnp" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.271423 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a3fa4251-dd48-417b-8002-6df02d3d3dac-scripts\") pod \"placement-db-sync-jqsnp\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") " pod="openstack/placement-db-sync-jqsnp" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.271448 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2274af64-0743-4ede-8fb8-e2ed801638ac-scripts\") pod \"cloudkitty-db-sync-l9w6z\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") " pod="openstack/cloudkitty-db-sync-l9w6z" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.271481 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2274af64-0743-4ede-8fb8-e2ed801638ac-combined-ca-bundle\") pod \"cloudkitty-db-sync-l9w6z\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") " pod="openstack/cloudkitty-db-sync-l9w6z" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.271512 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwsv7\" (UniqueName: \"kubernetes.io/projected/2274af64-0743-4ede-8fb8-e2ed801638ac-kube-api-access-bwsv7\") pod \"cloudkitty-db-sync-l9w6z\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") " pod="openstack/cloudkitty-db-sync-l9w6z" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.272504 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3fa4251-dd48-417b-8002-6df02d3d3dac-logs\") pod \"placement-db-sync-jqsnp\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") " pod="openstack/placement-db-sync-jqsnp" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.285460 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2274af64-0743-4ede-8fb8-e2ed801638ac-config-data\") pod \"cloudkitty-db-sync-l9w6z\" (UID: 
\"2274af64-0743-4ede-8fb8-e2ed801638ac\") " pod="openstack/cloudkitty-db-sync-l9w6z" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.286359 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3fa4251-dd48-417b-8002-6df02d3d3dac-config-data\") pod \"placement-db-sync-jqsnp\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") " pod="openstack/placement-db-sync-jqsnp" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.304280 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwsv7\" (UniqueName: \"kubernetes.io/projected/2274af64-0743-4ede-8fb8-e2ed801638ac-kube-api-access-bwsv7\") pod \"cloudkitty-db-sync-l9w6z\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") " pod="openstack/cloudkitty-db-sync-l9w6z" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.310497 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3fa4251-dd48-417b-8002-6df02d3d3dac-combined-ca-bundle\") pod \"placement-db-sync-jqsnp\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") " pod="openstack/placement-db-sync-jqsnp" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.319131 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2274af64-0743-4ede-8fb8-e2ed801638ac-combined-ca-bundle\") pod \"cloudkitty-db-sync-l9w6z\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") " pod="openstack/cloudkitty-db-sync-l9w6z" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.333936 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/2274af64-0743-4ede-8fb8-e2ed801638ac-certs\") pod \"cloudkitty-db-sync-l9w6z\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") " pod="openstack/cloudkitty-db-sync-l9w6z" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 
00:26:39.355039 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2274af64-0743-4ede-8fb8-e2ed801638ac-scripts\") pod \"cloudkitty-db-sync-l9w6z\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") " pod="openstack/cloudkitty-db-sync-l9w6z" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.357034 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3fa4251-dd48-417b-8002-6df02d3d3dac-scripts\") pod \"placement-db-sync-jqsnp\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") " pod="openstack/placement-db-sync-jqsnp" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.372158 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwdbp\" (UniqueName: \"kubernetes.io/projected/a3fa4251-dd48-417b-8002-6df02d3d3dac-kube-api-access-pwdbp\") pod \"placement-db-sync-jqsnp\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") " pod="openstack/placement-db-sync-jqsnp" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.384199 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1055fd61-f323-4cc6-8109-5096add1af65-config-data\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.384273 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1055fd61-f323-4cc6-8109-5096add1af65-scripts\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.384298 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1055fd61-f323-4cc6-8109-5096add1af65-logs\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.384337 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1055fd61-f323-4cc6-8109-5096add1af65-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.384371 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1055fd61-f323-4cc6-8109-5096add1af65-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.384404 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.384466 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxqvq\" (UniqueName: \"kubernetes.io/projected/1055fd61-f323-4cc6-8109-5096add1af65-kube-api-access-hxqvq\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.387167 4781 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.389204 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.393850 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.405354 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.494048 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1055fd61-f323-4cc6-8109-5096add1af65-scripts\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.494101 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1055fd61-f323-4cc6-8109-5096add1af65-logs\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.494155 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1055fd61-f323-4cc6-8109-5096add1af65-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.494188 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/1055fd61-f323-4cc6-8109-5096add1af65-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.494215 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.494260 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxqvq\" (UniqueName: \"kubernetes.io/projected/1055fd61-f323-4cc6-8109-5096add1af65-kube-api-access-hxqvq\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.494344 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1055fd61-f323-4cc6-8109-5096add1af65-config-data\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.495168 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1055fd61-f323-4cc6-8109-5096add1af65-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.495409 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/1055fd61-f323-4cc6-8109-5096add1af65-logs\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.500997 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.501035 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5d3045414bd1cd74ec61e0394ba262493610c57a87bbc940ef275e8fc1bc2ecf/globalmount\"" pod="openstack/glance-default-external-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.501703 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1055fd61-f323-4cc6-8109-5096add1af65-scripts\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.504358 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1055fd61-f323-4cc6-8109-5096add1af65-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.507453 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1055fd61-f323-4cc6-8109-5096add1af65-config-data\") 
pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.532356 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxqvq\" (UniqueName: \"kubernetes.io/projected/1055fd61-f323-4cc6-8109-5096add1af65-kube-api-access-hxqvq\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.595684 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac31f36e-35c4-4f48-a05b-f49855052358-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.595741 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.595782 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac31f36e-35c4-4f48-a05b-f49855052358-logs\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.595810 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ac31f36e-35c4-4f48-a05b-f49855052358-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.595874 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac31f36e-35c4-4f48-a05b-f49855052358-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.595928 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac31f36e-35c4-4f48-a05b-f49855052358-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.595950 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt497\" (UniqueName: \"kubernetes.io/projected/ac31f36e-35c4-4f48-a05b-f49855052358-kube-api-access-wt497\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.617414 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"glance-default-external-api-0\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.698328 4781 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac31f36e-35c4-4f48-a05b-f49855052358-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.698799 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.698879 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac31f36e-35c4-4f48-a05b-f49855052358-logs\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.699370 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac31f36e-35c4-4f48-a05b-f49855052358-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.699454 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac31f36e-35c4-4f48-a05b-f49855052358-logs\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.699464 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ac31f36e-35c4-4f48-a05b-f49855052358-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.699667 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac31f36e-35c4-4f48-a05b-f49855052358-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.699697 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt497\" (UniqueName: \"kubernetes.io/projected/ac31f36e-35c4-4f48-a05b-f49855052358-kube-api-access-wt497\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.700321 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac31f36e-35c4-4f48-a05b-f49855052358-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.703317 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.703351 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a7b96405e17327882846f95b5adf8b290f3f24e0a3e5cf6d272cf20133e6cae4/globalmount\"" pod="openstack/glance-default-internal-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.705000 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac31f36e-35c4-4f48-a05b-f49855052358-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.705443 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac31f36e-35c4-4f48-a05b-f49855052358-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.707602 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac31f36e-35c4-4f48-a05b-f49855052358-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.721360 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt497\" (UniqueName: \"kubernetes.io/projected/ac31f36e-35c4-4f48-a05b-f49855052358-kube-api-access-wt497\") pod 
\"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.743160 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-l9w6z" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.783315 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jqsnp" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.784280 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.847579 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\") pod \"glance-default-internal-api-0\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.852461 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.865707 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-784f69c749-m2gmj"] Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.908853 4781 generic.go:334] "Generic (PLEG): container finished" podID="8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0" containerID="f941532db51eb7d8f322beb83db6f8252752ff4cb56b4df1b09edb6a1f01a13c" exitCode=0 Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.908917 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" event={"ID":"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0","Type":"ContainerDied","Data":"f941532db51eb7d8f322beb83db6f8252752ff4cb56b4df1b09edb6a1f01a13c"} Feb 27 00:26:39 crc kubenswrapper[4781]: I0227 00:26:39.925971 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fwppv"] Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.049419 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-82c69"] Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.485824 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-5d6jk"] Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.494799 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-9vlp4"] Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.542700 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-bk54r"] Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.559958 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-bf4zw"] Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.774283 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-l9w6z"] Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.781366 4781 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.790771 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.859282 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-config\") pod \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\" (UID: \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.859389 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n688\" (UniqueName: \"kubernetes.io/projected/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-kube-api-access-9n688\") pod \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\" (UID: \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.859479 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-ovsdbserver-nb\") pod \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\" (UID: \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.859522 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-dns-svc\") pod \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\" (UID: \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.859652 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-ovsdbserver-sb\") pod \"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\" (UID: 
\"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0\") " Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.868018 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-kube-api-access-9n688" (OuterVolumeSpecName: "kube-api-access-9n688") pod "8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0" (UID: "8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0"). InnerVolumeSpecName "kube-api-access-9n688". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.887048 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0" (UID: "8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.908796 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0" (UID: "8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.909430 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0" (UID: "8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.925464 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-config" (OuterVolumeSpecName: "config") pod "8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0" (UID: "8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.930392 4781 generic.go:334] "Generic (PLEG): container finished" podID="8be86e27-4a35-4929-92d1-bfcd0ce641a8" containerID="0f7d06bdf37788cb17229ac4656b5472f2e83b9e00dc3b57906aa80af5c573a4" exitCode=0 Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.930453 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-784f69c749-m2gmj" event={"ID":"8be86e27-4a35-4929-92d1-bfcd0ce641a8","Type":"ContainerDied","Data":"0f7d06bdf37788cb17229ac4656b5472f2e83b9e00dc3b57906aa80af5c573a4"} Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.930479 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-784f69c749-m2gmj" event={"ID":"8be86e27-4a35-4929-92d1-bfcd0ce641a8","Type":"ContainerStarted","Data":"4ad7a1f1a6be4d60bb1c31fde9beac294118a7abd8ca340a5f209fda9ba451e6"} Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.933000 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d","Type":"ContainerStarted","Data":"8523e5974cb6fe577a148d4d77627c86ea1298c44ff6fdd8db602516c249b5d9"} Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.934178 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-bf4zw" event={"ID":"314ca901-3264-4136-b377-daad0075b72c","Type":"ContainerStarted","Data":"0dfa44d37d2f64ae96d38dcaa27616ed0a623f908ad45d0066876fbf98be36ee"} Feb 27 00:26:40 crc 
kubenswrapper[4781]: I0227 00:26:40.936401 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bk54r" event={"ID":"3f43ab5c-f862-468c-92c1-ec7366eb7ed0","Type":"ContainerStarted","Data":"9abe8ef3a48995708f20de72923495db036e6761eb107a6dfc8ea5dccc96bf58"} Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.936446 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bk54r" event={"ID":"3f43ab5c-f862-468c-92c1-ec7366eb7ed0","Type":"ContainerStarted","Data":"f4907a514c133717f2dd463877fdc9d6b4b6535ee45f8865a1b93ba48242fe73"} Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.943366 4781 generic.go:334] "Generic (PLEG): container finished" podID="9673a51c-390f-4e38-ae85-e5c3e1eaa816" containerID="517ead0b52aa65ce0d6fa994a34b320f55d9362dc9c894d32144d2082b233fb4" exitCode=0 Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.943427 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-82c69" event={"ID":"9673a51c-390f-4e38-ae85-e5c3e1eaa816","Type":"ContainerDied","Data":"517ead0b52aa65ce0d6fa994a34b320f55d9362dc9c894d32144d2082b233fb4"} Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.943455 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-82c69" event={"ID":"9673a51c-390f-4e38-ae85-e5c3e1eaa816","Type":"ContainerStarted","Data":"be280fe4c5dcd4daf8e49a0f28102aac1f55d36e2411ff42f6fe40cde84b1918"} Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.948756 4781 generic.go:334] "Generic (PLEG): container finished" podID="555d083f-48ec-4cf2-922f-211c99af51be" containerID="bba0de3b0a693253442f20a5772ff5d51a195d6366bc0feba6d0fb4dc5446a15" exitCode=0 Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.948804 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" 
event={"ID":"555d083f-48ec-4cf2-922f-211c99af51be","Type":"ContainerDied","Data":"bba0de3b0a693253442f20a5772ff5d51a195d6366bc0feba6d0fb4dc5446a15"} Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.948823 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" event={"ID":"555d083f-48ec-4cf2-922f-211c99af51be","Type":"ContainerStarted","Data":"e585d85b515ebdb2e3ddbc1e6c665f7d63c4b4ae71b72a576899c9774906e6b5"} Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.965367 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9n688\" (UniqueName: \"kubernetes.io/projected/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-kube-api-access-9n688\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.965655 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.965668 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.965675 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.965685 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.971931 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-l9w6z" 
event={"ID":"2274af64-0743-4ede-8fb8-e2ed801638ac","Type":"ContainerStarted","Data":"6d62d5f9e32bc3adf9e5c830b2c7fb23773647380ed0a769526c60e85872b03f"} Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.982988 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" event={"ID":"8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0","Type":"ContainerDied","Data":"8ff5a2fc17a6b9e8aabae8319eed0080a626c32c0d363e4d215380fca5df2f7c"} Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.983011 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-jzk92" Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.983064 4781 scope.go:117] "RemoveContainer" containerID="f941532db51eb7d8f322beb83db6f8252752ff4cb56b4df1b09edb6a1f01a13c" Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.991378 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fwppv" event={"ID":"75b432e5-2a1d-421d-ac63-202bbe4be5c5","Type":"ContainerStarted","Data":"914d10b311f6e761cfe3376de0d9169e16d04822bd5c0495a9b64cbbe456b1f4"} Feb 27 00:26:40 crc kubenswrapper[4781]: I0227 00:26:40.991429 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fwppv" event={"ID":"75b432e5-2a1d-421d-ac63-202bbe4be5c5","Type":"ContainerStarted","Data":"7610fc60d1158180f6a0fcb6c59bb1930e7a3d6fd2c319da87c248ff0413eb39"} Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.026926 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9vlp4" event={"ID":"aef65495-ecb2-4396-bb05-a4c5ee48f291","Type":"ContainerStarted","Data":"77049757ad8c9d1e53f2546542f34ddf95b52b836b4034f26af7417bb129d6d8"} Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.050460 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-bk54r" podStartSLOduration=3.050445003 
podStartE2EDuration="3.050445003s" podCreationTimestamp="2026-02-27 00:26:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:26:41.048174823 +0000 UTC m=+1270.305714377" watchObservedRunningTime="2026-02-27 00:26:41.050445003 +0000 UTC m=+1270.307984557" Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.118348 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jqsnp"] Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.119748 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-fwppv" podStartSLOduration=3.119732473 podStartE2EDuration="3.119732473s" podCreationTimestamp="2026-02-27 00:26:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:26:41.099298173 +0000 UTC m=+1270.356837737" watchObservedRunningTime="2026-02-27 00:26:41.119732473 +0000 UTC m=+1270.377272027" Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.176543 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-jzk92"] Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.191109 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-jzk92"] Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.211830 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 00:26:41 crc kubenswrapper[4781]: W0227 00:26:41.236118 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1055fd61_f323_4cc6_8109_5096add1af65.slice/crio-0a4dbcea2445e2dfe290d51e82f770fb1c0297a9dd9fce44fbc8991c778b4137 WatchSource:0}: Error finding container 0a4dbcea2445e2dfe290d51e82f770fb1c0297a9dd9fce44fbc8991c778b4137: Status 
404 returned error can't find the container with id 0a4dbcea2445e2dfe290d51e82f770fb1c0297a9dd9fce44fbc8991c778b4137 Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.474758 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0" path="/var/lib/kubelet/pods/8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0/volumes" Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.663313 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.692809 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-config\") pod \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.693234 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrnxf\" (UniqueName: \"kubernetes.io/projected/8be86e27-4a35-4929-92d1-bfcd0ce641a8-kube-api-access-nrnxf\") pod \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.703819 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8be86e27-4a35-4929-92d1-bfcd0ce641a8-kube-api-access-nrnxf" (OuterVolumeSpecName: "kube-api-access-nrnxf") pod "8be86e27-4a35-4929-92d1-bfcd0ce641a8" (UID: "8be86e27-4a35-4929-92d1-bfcd0ce641a8"). InnerVolumeSpecName "kube-api-access-nrnxf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.743470 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-config" (OuterVolumeSpecName: "config") pod "8be86e27-4a35-4929-92d1-bfcd0ce641a8" (UID: "8be86e27-4a35-4929-92d1-bfcd0ce641a8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.764042 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.800287 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-ovsdbserver-nb\") pod \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.800391 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-dns-swift-storage-0\") pod \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.800425 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-dns-svc\") pod \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.800487 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bst95\" (UniqueName: \"kubernetes.io/projected/9673a51c-390f-4e38-ae85-e5c3e1eaa816-kube-api-access-bst95\") pod 
\"9673a51c-390f-4e38-ae85-e5c3e1eaa816\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.800534 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-ovsdbserver-sb\") pod \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.800554 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-ovsdbserver-sb\") pod \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.800575 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-config\") pod \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.800711 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-dns-svc\") pod \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\" (UID: \"8be86e27-4a35-4929-92d1-bfcd0ce641a8\") " Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.800761 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-ovsdbserver-nb\") pod \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\" (UID: \"9673a51c-390f-4e38-ae85-e5c3e1eaa816\") " Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.801310 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.801339 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrnxf\" (UniqueName: \"kubernetes.io/projected/8be86e27-4a35-4929-92d1-bfcd0ce641a8-kube-api-access-nrnxf\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.830415 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9673a51c-390f-4e38-ae85-e5c3e1eaa816-kube-api-access-bst95" (OuterVolumeSpecName: "kube-api-access-bst95") pod "9673a51c-390f-4e38-ae85-e5c3e1eaa816" (UID: "9673a51c-390f-4e38-ae85-e5c3e1eaa816"). InnerVolumeSpecName "kube-api-access-bst95". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:41 crc kubenswrapper[4781]: I0227 00:26:41.910424 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bst95\" (UniqueName: \"kubernetes.io/projected/9673a51c-390f-4e38-ae85-e5c3e1eaa816-kube-api-access-bst95\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.058826 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1055fd61-f323-4cc6-8109-5096add1af65","Type":"ContainerStarted","Data":"0a4dbcea2445e2dfe290d51e82f770fb1c0297a9dd9fce44fbc8991c778b4137"} Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.084495 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8be86e27-4a35-4929-92d1-bfcd0ce641a8" (UID: "8be86e27-4a35-4929-92d1-bfcd0ce641a8"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.089500 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9673a51c-390f-4e38-ae85-e5c3e1eaa816" (UID: "9673a51c-390f-4e38-ae85-e5c3e1eaa816"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.089748 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-82c69" event={"ID":"9673a51c-390f-4e38-ae85-e5c3e1eaa816","Type":"ContainerDied","Data":"be280fe4c5dcd4daf8e49a0f28102aac1f55d36e2411ff42f6fe40cde84b1918"} Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.089877 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-82c69" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.089900 4781 scope.go:117] "RemoveContainer" containerID="517ead0b52aa65ce0d6fa994a34b320f55d9362dc9c894d32144d2082b233fb4" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.103941 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jqsnp" event={"ID":"a3fa4251-dd48-417b-8002-6df02d3d3dac","Type":"ContainerStarted","Data":"6da65166fa2a15c764d849696d3e6b0686802ef8180c50248f4b03677850887a"} Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.112059 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.114190 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-784f69c749-m2gmj" event={"ID":"8be86e27-4a35-4929-92d1-bfcd0ce641a8","Type":"ContainerDied","Data":"4ad7a1f1a6be4d60bb1c31fde9beac294118a7abd8ca340a5f209fda9ba451e6"} Feb 27 00:26:42 crc 
kubenswrapper[4781]: I0227 00:26:42.114439 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-784f69c749-m2gmj" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.116135 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.116277 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.118299 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9673a51c-390f-4e38-ae85-e5c3e1eaa816" (UID: "9673a51c-390f-4e38-ae85-e5c3e1eaa816"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:42 crc kubenswrapper[4781]: W0227 00:26:42.120999 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac31f36e_35c4_4f48_a05b_f49855052358.slice/crio-084b441c862a06da9a9c0b4c64f7175017dfb66c7fcb6d7f1ae1871790335c15 WatchSource:0}: Error finding container 084b441c862a06da9a9c0b4c64f7175017dfb66c7fcb6d7f1ae1871790335c15: Status 404 returned error can't find the container with id 084b441c862a06da9a9c0b4c64f7175017dfb66c7fcb6d7f1ae1871790335c15 Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.138160 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8be86e27-4a35-4929-92d1-bfcd0ce641a8" (UID: "8be86e27-4a35-4929-92d1-bfcd0ce641a8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.141766 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-config" (OuterVolumeSpecName: "config") pod "9673a51c-390f-4e38-ae85-e5c3e1eaa816" (UID: "9673a51c-390f-4e38-ae85-e5c3e1eaa816"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.141876 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9673a51c-390f-4e38-ae85-e5c3e1eaa816" (UID: "9673a51c-390f-4e38-ae85-e5c3e1eaa816"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.144500 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8be86e27-4a35-4929-92d1-bfcd0ce641a8" (UID: "8be86e27-4a35-4929-92d1-bfcd0ce641a8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.147097 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9673a51c-390f-4e38-ae85-e5c3e1eaa816" (UID: "9673a51c-390f-4e38-ae85-e5c3e1eaa816"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.199353 4781 scope.go:117] "RemoveContainer" containerID="0f7d06bdf37788cb17229ac4656b5472f2e83b9e00dc3b57906aa80af5c573a4" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.229329 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.229361 4781 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.229371 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.229380 4781 reconciler_common.go:293] "Volume detached for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.229388 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9673a51c-390f-4e38-ae85-e5c3e1eaa816-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.229396 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8be86e27-4a35-4929-92d1-bfcd0ce641a8-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.541036 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-82c69"] Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.558789 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-82c69"] Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.592443 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-784f69c749-m2gmj"] Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.614903 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-784f69c749-m2gmj"] Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.892757 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.895788 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.895833 4781 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.951541 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:26:42 crc kubenswrapper[4781]: I0227 00:26:42.977641 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 00:26:43 crc kubenswrapper[4781]: I0227 00:26:43.137694 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ac31f36e-35c4-4f48-a05b-f49855052358","Type":"ContainerStarted","Data":"084b441c862a06da9a9c0b4c64f7175017dfb66c7fcb6d7f1ae1871790335c15"} Feb 27 00:26:43 crc kubenswrapper[4781]: I0227 00:26:43.140326 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" event={"ID":"555d083f-48ec-4cf2-922f-211c99af51be","Type":"ContainerStarted","Data":"4167de23c8e92b3c4e7bea2c7dcffa9a6588134dc57cd8822b8e44fa6c1099d2"} Feb 27 00:26:43 crc kubenswrapper[4781]: I0227 00:26:43.141892 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" Feb 27 00:26:43 crc kubenswrapper[4781]: I0227 00:26:43.152093 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1055fd61-f323-4cc6-8109-5096add1af65","Type":"ContainerStarted","Data":"a64eaa5dc3f9214454b1c2c8f2206da3278a11327bde4d2d975c26b50904d318"} Feb 27 00:26:43 crc kubenswrapper[4781]: I0227 00:26:43.207205 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" podStartSLOduration=5.207184671 podStartE2EDuration="5.207184671s" podCreationTimestamp="2026-02-27 
00:26:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:26:43.180895086 +0000 UTC m=+1272.438434660" watchObservedRunningTime="2026-02-27 00:26:43.207184671 +0000 UTC m=+1272.464724235" Feb 27 00:26:43 crc kubenswrapper[4781]: I0227 00:26:43.343173 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8be86e27-4a35-4929-92d1-bfcd0ce641a8" path="/var/lib/kubelet/pods/8be86e27-4a35-4929-92d1-bfcd0ce641a8/volumes" Feb 27 00:26:43 crc kubenswrapper[4781]: I0227 00:26:43.343886 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9673a51c-390f-4e38-ae85-e5c3e1eaa816" path="/var/lib/kubelet/pods/9673a51c-390f-4e38-ae85-e5c3e1eaa816/volumes" Feb 27 00:26:44 crc kubenswrapper[4781]: I0227 00:26:44.181120 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ac31f36e-35c4-4f48-a05b-f49855052358","Type":"ContainerStarted","Data":"af130dc6503472cd229a6073407a944eb9f345fe510e3a5882815f2cc79c8dc8"} Feb 27 00:26:44 crc kubenswrapper[4781]: I0227 00:26:44.185502 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1055fd61-f323-4cc6-8109-5096add1af65","Type":"ContainerStarted","Data":"4baba8cd9b434fd2eeb1b9fd32bb1afc0b648fbd81e0b34b4701d41d322ffe7e"} Feb 27 00:26:44 crc kubenswrapper[4781]: I0227 00:26:44.185702 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1055fd61-f323-4cc6-8109-5096add1af65" containerName="glance-log" containerID="cri-o://a64eaa5dc3f9214454b1c2c8f2206da3278a11327bde4d2d975c26b50904d318" gracePeriod=30 Feb 27 00:26:44 crc kubenswrapper[4781]: I0227 00:26:44.185783 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" 
podUID="1055fd61-f323-4cc6-8109-5096add1af65" containerName="glance-httpd" containerID="cri-o://4baba8cd9b434fd2eeb1b9fd32bb1afc0b648fbd81e0b34b4701d41d322ffe7e" gracePeriod=30 Feb 27 00:26:44 crc kubenswrapper[4781]: I0227 00:26:44.211048 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.211029321 podStartE2EDuration="6.211029321s" podCreationTimestamp="2026-02-27 00:26:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:26:44.206576573 +0000 UTC m=+1273.464116117" watchObservedRunningTime="2026-02-27 00:26:44.211029321 +0000 UTC m=+1273.468568875" Feb 27 00:26:44 crc kubenswrapper[4781]: I0227 00:26:44.928988 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 00:26:44 crc kubenswrapper[4781]: I0227 00:26:44.994570 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxqvq\" (UniqueName: \"kubernetes.io/projected/1055fd61-f323-4cc6-8109-5096add1af65-kube-api-access-hxqvq\") pod \"1055fd61-f323-4cc6-8109-5096add1af65\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " Feb 27 00:26:44 crc kubenswrapper[4781]: I0227 00:26:44.994648 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1055fd61-f323-4cc6-8109-5096add1af65-combined-ca-bundle\") pod \"1055fd61-f323-4cc6-8109-5096add1af65\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " Feb 27 00:26:44 crc kubenswrapper[4781]: I0227 00:26:44.994682 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1055fd61-f323-4cc6-8109-5096add1af65-logs\") pod \"1055fd61-f323-4cc6-8109-5096add1af65\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " 
Feb 27 00:26:44 crc kubenswrapper[4781]: I0227 00:26:44.994711 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1055fd61-f323-4cc6-8109-5096add1af65-scripts\") pod \"1055fd61-f323-4cc6-8109-5096add1af65\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " Feb 27 00:26:44 crc kubenswrapper[4781]: I0227 00:26:44.994799 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1055fd61-f323-4cc6-8109-5096add1af65-config-data\") pod \"1055fd61-f323-4cc6-8109-5096add1af65\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " Feb 27 00:26:44 crc kubenswrapper[4781]: I0227 00:26:44.994834 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1055fd61-f323-4cc6-8109-5096add1af65-httpd-run\") pod \"1055fd61-f323-4cc6-8109-5096add1af65\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " Feb 27 00:26:44 crc kubenswrapper[4781]: I0227 00:26:44.994940 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"1055fd61-f323-4cc6-8109-5096add1af65\" (UID: \"1055fd61-f323-4cc6-8109-5096add1af65\") " Feb 27 00:26:44 crc kubenswrapper[4781]: I0227 00:26:44.995668 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1055fd61-f323-4cc6-8109-5096add1af65-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1055fd61-f323-4cc6-8109-5096add1af65" (UID: "1055fd61-f323-4cc6-8109-5096add1af65"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:26:44 crc kubenswrapper[4781]: I0227 00:26:44.995971 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1055fd61-f323-4cc6-8109-5096add1af65-logs" (OuterVolumeSpecName: "logs") pod "1055fd61-f323-4cc6-8109-5096add1af65" (UID: "1055fd61-f323-4cc6-8109-5096add1af65"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.009300 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1055fd61-f323-4cc6-8109-5096add1af65-kube-api-access-hxqvq" (OuterVolumeSpecName: "kube-api-access-hxqvq") pod "1055fd61-f323-4cc6-8109-5096add1af65" (UID: "1055fd61-f323-4cc6-8109-5096add1af65"). InnerVolumeSpecName "kube-api-access-hxqvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.016846 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1055fd61-f323-4cc6-8109-5096add1af65-scripts" (OuterVolumeSpecName: "scripts") pod "1055fd61-f323-4cc6-8109-5096add1af65" (UID: "1055fd61-f323-4cc6-8109-5096add1af65"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.034478 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b" (OuterVolumeSpecName: "glance") pod "1055fd61-f323-4cc6-8109-5096add1af65" (UID: "1055fd61-f323-4cc6-8109-5096add1af65"). InnerVolumeSpecName "pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.059725 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1055fd61-f323-4cc6-8109-5096add1af65-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1055fd61-f323-4cc6-8109-5096add1af65" (UID: "1055fd61-f323-4cc6-8109-5096add1af65"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.098932 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxqvq\" (UniqueName: \"kubernetes.io/projected/1055fd61-f323-4cc6-8109-5096add1af65-kube-api-access-hxqvq\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.098964 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1055fd61-f323-4cc6-8109-5096add1af65-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.098973 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1055fd61-f323-4cc6-8109-5096add1af65-logs\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.098983 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1055fd61-f323-4cc6-8109-5096add1af65-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.098993 4781 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1055fd61-f323-4cc6-8109-5096add1af65-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.099026 4781 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume 
\"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") on node \"crc\" " Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.106923 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1055fd61-f323-4cc6-8109-5096add1af65-config-data" (OuterVolumeSpecName: "config-data") pod "1055fd61-f323-4cc6-8109-5096add1af65" (UID: "1055fd61-f323-4cc6-8109-5096add1af65"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.127340 4781 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.127516 4781 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b") on node "crc" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.201710 4781 reconciler_common.go:293] "Volume detached for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.201771 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1055fd61-f323-4cc6-8109-5096add1af65-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.206451 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ac31f36e-35c4-4f48-a05b-f49855052358","Type":"ContainerStarted","Data":"2a9c0dd16ab1b03b070571c17e79b75425373239b1c45b1c5b15a3b9a4d8f4b1"} Feb 27 00:26:45 crc 
kubenswrapper[4781]: I0227 00:26:45.206567 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ac31f36e-35c4-4f48-a05b-f49855052358" containerName="glance-log" containerID="cri-o://af130dc6503472cd229a6073407a944eb9f345fe510e3a5882815f2cc79c8dc8" gracePeriod=30 Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.206669 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ac31f36e-35c4-4f48-a05b-f49855052358" containerName="glance-httpd" containerID="cri-o://2a9c0dd16ab1b03b070571c17e79b75425373239b1c45b1c5b15a3b9a4d8f4b1" gracePeriod=30 Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.211311 4781 generic.go:334] "Generic (PLEG): container finished" podID="1055fd61-f323-4cc6-8109-5096add1af65" containerID="4baba8cd9b434fd2eeb1b9fd32bb1afc0b648fbd81e0b34b4701d41d322ffe7e" exitCode=143 Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.211343 4781 generic.go:334] "Generic (PLEG): container finished" podID="1055fd61-f323-4cc6-8109-5096add1af65" containerID="a64eaa5dc3f9214454b1c2c8f2206da3278a11327bde4d2d975c26b50904d318" exitCode=143 Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.211386 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.211423 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1055fd61-f323-4cc6-8109-5096add1af65","Type":"ContainerDied","Data":"4baba8cd9b434fd2eeb1b9fd32bb1afc0b648fbd81e0b34b4701d41d322ffe7e"} Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.211449 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1055fd61-f323-4cc6-8109-5096add1af65","Type":"ContainerDied","Data":"a64eaa5dc3f9214454b1c2c8f2206da3278a11327bde4d2d975c26b50904d318"} Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.211459 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1055fd61-f323-4cc6-8109-5096add1af65","Type":"ContainerDied","Data":"0a4dbcea2445e2dfe290d51e82f770fb1c0297a9dd9fce44fbc8991c778b4137"} Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.211473 4781 scope.go:117] "RemoveContainer" containerID="4baba8cd9b434fd2eeb1b9fd32bb1afc0b648fbd81e0b34b4701d41d322ffe7e" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.242453 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.242427699 podStartE2EDuration="7.242427699s" podCreationTimestamp="2026-02-27 00:26:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:26:45.233030141 +0000 UTC m=+1274.490569695" watchObservedRunningTime="2026-02-27 00:26:45.242427699 +0000 UTC m=+1274.499967253" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.299558 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.306857 4781 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.348150 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1055fd61-f323-4cc6-8109-5096add1af65" path="/var/lib/kubelet/pods/1055fd61-f323-4cc6-8109-5096add1af65/volumes" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.355768 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 00:26:45 crc kubenswrapper[4781]: E0227 00:26:45.356317 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8be86e27-4a35-4929-92d1-bfcd0ce641a8" containerName="init" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.356343 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8be86e27-4a35-4929-92d1-bfcd0ce641a8" containerName="init" Feb 27 00:26:45 crc kubenswrapper[4781]: E0227 00:26:45.356364 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9673a51c-390f-4e38-ae85-e5c3e1eaa816" containerName="init" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.356375 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9673a51c-390f-4e38-ae85-e5c3e1eaa816" containerName="init" Feb 27 00:26:45 crc kubenswrapper[4781]: E0227 00:26:45.356406 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0" containerName="init" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.356414 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0" containerName="init" Feb 27 00:26:45 crc kubenswrapper[4781]: E0227 00:26:45.356437 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1055fd61-f323-4cc6-8109-5096add1af65" containerName="glance-httpd" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.356446 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="1055fd61-f323-4cc6-8109-5096add1af65" 
containerName="glance-httpd" Feb 27 00:26:45 crc kubenswrapper[4781]: E0227 00:26:45.356458 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1055fd61-f323-4cc6-8109-5096add1af65" containerName="glance-log" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.356466 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="1055fd61-f323-4cc6-8109-5096add1af65" containerName="glance-log" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.356707 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="8be86e27-4a35-4929-92d1-bfcd0ce641a8" containerName="init" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.356744 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="1055fd61-f323-4cc6-8109-5096add1af65" containerName="glance-httpd" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.356764 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e5ea5d5-cb8b-40d5-98fa-6af50f975ef0" containerName="init" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.356781 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="9673a51c-390f-4e38-ae85-e5c3e1eaa816" containerName="init" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.356794 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="1055fd61-f323-4cc6-8109-5096add1af65" containerName="glance-log" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.358544 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.361993 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.380335 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.507288 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe8b7774-c640-416d-82a4-535fee88a47b-config-data\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.507360 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vxcl\" (UniqueName: \"kubernetes.io/projected/fe8b7774-c640-416d-82a4-535fee88a47b-kube-api-access-8vxcl\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.507399 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.507474 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe8b7774-c640-416d-82a4-535fee88a47b-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.507494 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe8b7774-c640-416d-82a4-535fee88a47b-logs\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.507540 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fe8b7774-c640-416d-82a4-535fee88a47b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.507557 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe8b7774-c640-416d-82a4-535fee88a47b-scripts\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.609386 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe8b7774-c640-416d-82a4-535fee88a47b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.609431 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe8b7774-c640-416d-82a4-535fee88a47b-logs\") pod \"glance-default-external-api-0\" (UID: 
\"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.609498 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fe8b7774-c640-416d-82a4-535fee88a47b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.609531 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe8b7774-c640-416d-82a4-535fee88a47b-scripts\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.609572 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe8b7774-c640-416d-82a4-535fee88a47b-config-data\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.609723 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vxcl\" (UniqueName: \"kubernetes.io/projected/fe8b7774-c640-416d-82a4-535fee88a47b-kube-api-access-8vxcl\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.609773 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"glance-default-external-api-0\" (UID: 
\"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.610674 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fe8b7774-c640-416d-82a4-535fee88a47b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.610757 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe8b7774-c640-416d-82a4-535fee88a47b-logs\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.614512 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe8b7774-c640-416d-82a4-535fee88a47b-scripts\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.615317 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe8b7774-c640-416d-82a4-535fee88a47b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.615697 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe8b7774-c640-416d-82a4-535fee88a47b-config-data\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: 
I0227 00:26:45.616654 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.616821 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5d3045414bd1cd74ec61e0394ba262493610c57a87bbc940ef275e8fc1bc2ecf/globalmount\"" pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.638540 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vxcl\" (UniqueName: \"kubernetes.io/projected/fe8b7774-c640-416d-82a4-535fee88a47b-kube-api-access-8vxcl\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.710958 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"glance-default-external-api-0\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") " pod="openstack/glance-default-external-api-0" Feb 27 00:26:45 crc kubenswrapper[4781]: I0227 00:26:45.984979 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 00:26:46 crc kubenswrapper[4781]: I0227 00:26:46.228116 4781 generic.go:334] "Generic (PLEG): container finished" podID="75b432e5-2a1d-421d-ac63-202bbe4be5c5" containerID="914d10b311f6e761cfe3376de0d9169e16d04822bd5c0495a9b64cbbe456b1f4" exitCode=0 Feb 27 00:26:46 crc kubenswrapper[4781]: I0227 00:26:46.228209 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fwppv" event={"ID":"75b432e5-2a1d-421d-ac63-202bbe4be5c5","Type":"ContainerDied","Data":"914d10b311f6e761cfe3376de0d9169e16d04822bd5c0495a9b64cbbe456b1f4"} Feb 27 00:26:46 crc kubenswrapper[4781]: I0227 00:26:46.233849 4781 generic.go:334] "Generic (PLEG): container finished" podID="ac31f36e-35c4-4f48-a05b-f49855052358" containerID="2a9c0dd16ab1b03b070571c17e79b75425373239b1c45b1c5b15a3b9a4d8f4b1" exitCode=0 Feb 27 00:26:46 crc kubenswrapper[4781]: I0227 00:26:46.233893 4781 generic.go:334] "Generic (PLEG): container finished" podID="ac31f36e-35c4-4f48-a05b-f49855052358" containerID="af130dc6503472cd229a6073407a944eb9f345fe510e3a5882815f2cc79c8dc8" exitCode=143 Feb 27 00:26:46 crc kubenswrapper[4781]: I0227 00:26:46.233926 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ac31f36e-35c4-4f48-a05b-f49855052358","Type":"ContainerDied","Data":"2a9c0dd16ab1b03b070571c17e79b75425373239b1c45b1c5b15a3b9a4d8f4b1"} Feb 27 00:26:46 crc kubenswrapper[4781]: I0227 00:26:46.233989 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ac31f36e-35c4-4f48-a05b-f49855052358","Type":"ContainerDied","Data":"af130dc6503472cd229a6073407a944eb9f345fe510e3a5882815f2cc79c8dc8"} Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.016879 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.060299 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-combined-ca-bundle\") pod \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.060368 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-config-data\") pod \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.060428 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-fernet-keys\") pod \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.060456 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-scripts\") pod \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.060492 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-credential-keys\") pod \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.060570 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdcl6\" (UniqueName: 
\"kubernetes.io/projected/75b432e5-2a1d-421d-ac63-202bbe4be5c5-kube-api-access-pdcl6\") pod \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\" (UID: \"75b432e5-2a1d-421d-ac63-202bbe4be5c5\") " Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.069478 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "75b432e5-2a1d-421d-ac63-202bbe4be5c5" (UID: "75b432e5-2a1d-421d-ac63-202bbe4be5c5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.069667 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75b432e5-2a1d-421d-ac63-202bbe4be5c5-kube-api-access-pdcl6" (OuterVolumeSpecName: "kube-api-access-pdcl6") pod "75b432e5-2a1d-421d-ac63-202bbe4be5c5" (UID: "75b432e5-2a1d-421d-ac63-202bbe4be5c5"). InnerVolumeSpecName "kube-api-access-pdcl6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.083892 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "75b432e5-2a1d-421d-ac63-202bbe4be5c5" (UID: "75b432e5-2a1d-421d-ac63-202bbe4be5c5"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.083981 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-scripts" (OuterVolumeSpecName: "scripts") pod "75b432e5-2a1d-421d-ac63-202bbe4be5c5" (UID: "75b432e5-2a1d-421d-ac63-202bbe4be5c5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.105879 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-config-data" (OuterVolumeSpecName: "config-data") pod "75b432e5-2a1d-421d-ac63-202bbe4be5c5" (UID: "75b432e5-2a1d-421d-ac63-202bbe4be5c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.135119 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "75b432e5-2a1d-421d-ac63-202bbe4be5c5" (UID: "75b432e5-2a1d-421d-ac63-202bbe4be5c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.163486 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.163529 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.163541 4781 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.163552 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:48 crc 
kubenswrapper[4781]: I0227 00:26:48.163562 4781 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/75b432e5-2a1d-421d-ac63-202bbe4be5c5-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.163573 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdcl6\" (UniqueName: \"kubernetes.io/projected/75b432e5-2a1d-421d-ac63-202bbe4be5c5-kube-api-access-pdcl6\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.258170 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fwppv" event={"ID":"75b432e5-2a1d-421d-ac63-202bbe4be5c5","Type":"ContainerDied","Data":"7610fc60d1158180f6a0fcb6c59bb1930e7a3d6fd2c319da87c248ff0413eb39"} Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.258219 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7610fc60d1158180f6a0fcb6c59bb1930e7a3d6fd2c319da87c248ff0413eb39" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.258287 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-fwppv" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.361636 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-fwppv"] Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.370950 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-fwppv"] Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.471686 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-gxj6b"] Feb 27 00:26:48 crc kubenswrapper[4781]: E0227 00:26:48.472154 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75b432e5-2a1d-421d-ac63-202bbe4be5c5" containerName="keystone-bootstrap" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.472171 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="75b432e5-2a1d-421d-ac63-202bbe4be5c5" containerName="keystone-bootstrap" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.472347 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="75b432e5-2a1d-421d-ac63-202bbe4be5c5" containerName="keystone-bootstrap" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.472999 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.478443 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.478497 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.478525 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.478861 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4nhgp" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.481433 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-gxj6b"] Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.573145 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-scripts\") pod \"keystone-bootstrap-gxj6b\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.573275 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh2sk\" (UniqueName: \"kubernetes.io/projected/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-kube-api-access-jh2sk\") pod \"keystone-bootstrap-gxj6b\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.573446 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-config-data\") pod \"keystone-bootstrap-gxj6b\" (UID: 
\"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.573500 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-credential-keys\") pod \"keystone-bootstrap-gxj6b\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.573578 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-combined-ca-bundle\") pod \"keystone-bootstrap-gxj6b\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.573619 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-fernet-keys\") pod \"keystone-bootstrap-gxj6b\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.678112 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-config-data\") pod \"keystone-bootstrap-gxj6b\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.678257 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-credential-keys\") pod \"keystone-bootstrap-gxj6b\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " 
pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.678297 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-combined-ca-bundle\") pod \"keystone-bootstrap-gxj6b\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.678341 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-fernet-keys\") pod \"keystone-bootstrap-gxj6b\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.678429 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-scripts\") pod \"keystone-bootstrap-gxj6b\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.678556 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh2sk\" (UniqueName: \"kubernetes.io/projected/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-kube-api-access-jh2sk\") pod \"keystone-bootstrap-gxj6b\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.683303 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-credential-keys\") pod \"keystone-bootstrap-gxj6b\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.683849 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-fernet-keys\") pod \"keystone-bootstrap-gxj6b\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.684266 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-config-data\") pod \"keystone-bootstrap-gxj6b\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.684583 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-combined-ca-bundle\") pod \"keystone-bootstrap-gxj6b\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.686666 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-scripts\") pod \"keystone-bootstrap-gxj6b\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.698557 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh2sk\" (UniqueName: \"kubernetes.io/projected/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-kube-api-access-jh2sk\") pod \"keystone-bootstrap-gxj6b\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") " pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.725596 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 
00:26:48.725884 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerName="prometheus" containerID="cri-o://490f54d4fc0654da6b5add2d9e470584271088a4fc9d0ff0972339bc97ab6f8f" gracePeriod=600 Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.725948 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerName="thanos-sidecar" containerID="cri-o://0d295c8666e863d2c0e4e0d3a3e33356c58f61c54e944f8ced4d911133124bc0" gracePeriod=600 Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.726036 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerName="config-reloader" containerID="cri-o://c6e860c6c62b63e5a5fe835a4877c45040a36e7fc332cce5af395a3eaa5e24b1" gracePeriod=600 Feb 27 00:26:48 crc kubenswrapper[4781]: I0227 00:26:48.802656 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:26:49 crc kubenswrapper[4781]: I0227 00:26:49.225769 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" Feb 27 00:26:49 crc kubenswrapper[4781]: I0227 00:26:49.302886 4781 generic.go:334] "Generic (PLEG): container finished" podID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerID="0d295c8666e863d2c0e4e0d3a3e33356c58f61c54e944f8ced4d911133124bc0" exitCode=0 Feb 27 00:26:49 crc kubenswrapper[4781]: I0227 00:26:49.302925 4781 generic.go:334] "Generic (PLEG): container finished" podID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerID="c6e860c6c62b63e5a5fe835a4877c45040a36e7fc332cce5af395a3eaa5e24b1" exitCode=0 Feb 27 00:26:49 crc kubenswrapper[4781]: I0227 00:26:49.302943 4781 generic.go:334] "Generic (PLEG): container finished" podID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerID="490f54d4fc0654da6b5add2d9e470584271088a4fc9d0ff0972339bc97ab6f8f" exitCode=0 Feb 27 00:26:49 crc kubenswrapper[4781]: I0227 00:26:49.302966 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1f85c54b-b800-429a-ba2d-fe22056ac907","Type":"ContainerDied","Data":"0d295c8666e863d2c0e4e0d3a3e33356c58f61c54e944f8ced4d911133124bc0"} Feb 27 00:26:49 crc kubenswrapper[4781]: I0227 00:26:49.303002 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1f85c54b-b800-429a-ba2d-fe22056ac907","Type":"ContainerDied","Data":"c6e860c6c62b63e5a5fe835a4877c45040a36e7fc332cce5af395a3eaa5e24b1"} Feb 27 00:26:49 crc kubenswrapper[4781]: I0227 00:26:49.303013 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1f85c54b-b800-429a-ba2d-fe22056ac907","Type":"ContainerDied","Data":"490f54d4fc0654da6b5add2d9e470584271088a4fc9d0ff0972339bc97ab6f8f"} Feb 27 00:26:49 crc kubenswrapper[4781]: I0227 
00:26:49.356754 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75b432e5-2a1d-421d-ac63-202bbe4be5c5" path="/var/lib/kubelet/pods/75b432e5-2a1d-421d-ac63-202bbe4be5c5/volumes" Feb 27 00:26:49 crc kubenswrapper[4781]: I0227 00:26:49.357352 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gc44h"] Feb 27 00:26:49 crc kubenswrapper[4781]: I0227 00:26:49.357559 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-gc44h" podUID="8e37b0a7-69ac-439e-9c5a-207210fe40c8" containerName="dnsmasq-dns" containerID="cri-o://e2bf980506549d387ee967a300bd50ff50a9e4489a44bcc2c952a5e2c00137a5" gracePeriod=10 Feb 27 00:26:49 crc kubenswrapper[4781]: I0227 00:26:49.766684 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 00:26:50 crc kubenswrapper[4781]: I0227 00:26:50.315451 4781 generic.go:334] "Generic (PLEG): container finished" podID="8e37b0a7-69ac-439e-9c5a-207210fe40c8" containerID="e2bf980506549d387ee967a300bd50ff50a9e4489a44bcc2c952a5e2c00137a5" exitCode=0 Feb 27 00:26:50 crc kubenswrapper[4781]: I0227 00:26:50.315488 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gc44h" event={"ID":"8e37b0a7-69ac-439e-9c5a-207210fe40c8","Type":"ContainerDied","Data":"e2bf980506549d387ee967a300bd50ff50a9e4489a44bcc2c952a5e2c00137a5"} Feb 27 00:26:50 crc kubenswrapper[4781]: I0227 00:26:50.483951 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-gc44h" podUID="8e37b0a7-69ac-439e-9c5a-207210fe40c8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: connect: connection refused" Feb 27 00:26:51 crc kubenswrapper[4781]: I0227 00:26:51.096036 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="1f85c54b-b800-429a-ba2d-fe22056ac907" 
containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.116:9090/-/ready\": dial tcp 10.217.0.116:9090: connect: connection refused" Feb 27 00:26:54 crc kubenswrapper[4781]: I0227 00:26:54.909958 4781 scope.go:117] "RemoveContainer" containerID="a64eaa5dc3f9214454b1c2c8f2206da3278a11327bde4d2d975c26b50904d318" Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.484983 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-gc44h" podUID="8e37b0a7-69ac-439e-9c5a-207210fe40c8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: connect: connection refused" Feb 27 00:26:55 crc kubenswrapper[4781]: E0227 00:26:55.488273 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Feb 27 00:26:55 crc kubenswrapper[4781]: E0227 00:26:55.488548 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lmp94,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-bf4zw_openstack(314ca901-3264-4136-b377-daad0075b72c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 00:26:55 crc kubenswrapper[4781]: E0227 00:26:55.489620 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-bf4zw" 
podUID="314ca901-3264-4136-b377-daad0075b72c" Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.591686 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.629251 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac31f36e-35c4-4f48-a05b-f49855052358-combined-ca-bundle\") pod \"ac31f36e-35c4-4f48-a05b-f49855052358\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.629445 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac31f36e-35c4-4f48-a05b-f49855052358-scripts\") pod \"ac31f36e-35c4-4f48-a05b-f49855052358\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.629508 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac31f36e-35c4-4f48-a05b-f49855052358-logs\") pod \"ac31f36e-35c4-4f48-a05b-f49855052358\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.629571 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wt497\" (UniqueName: \"kubernetes.io/projected/ac31f36e-35c4-4f48-a05b-f49855052358-kube-api-access-wt497\") pod \"ac31f36e-35c4-4f48-a05b-f49855052358\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.629600 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac31f36e-35c4-4f48-a05b-f49855052358-config-data\") pod \"ac31f36e-35c4-4f48-a05b-f49855052358\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " Feb 27 00:26:55 crc 
kubenswrapper[4781]: I0227 00:26:55.629655 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac31f36e-35c4-4f48-a05b-f49855052358-httpd-run\") pod \"ac31f36e-35c4-4f48-a05b-f49855052358\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.629785 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\") pod \"ac31f36e-35c4-4f48-a05b-f49855052358\" (UID: \"ac31f36e-35c4-4f48-a05b-f49855052358\") " Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.630042 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac31f36e-35c4-4f48-a05b-f49855052358-logs" (OuterVolumeSpecName: "logs") pod "ac31f36e-35c4-4f48-a05b-f49855052358" (UID: "ac31f36e-35c4-4f48-a05b-f49855052358"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.630219 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac31f36e-35c4-4f48-a05b-f49855052358-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ac31f36e-35c4-4f48-a05b-f49855052358" (UID: "ac31f36e-35c4-4f48-a05b-f49855052358"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.630681 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac31f36e-35c4-4f48-a05b-f49855052358-logs\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.630701 4781 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac31f36e-35c4-4f48-a05b-f49855052358-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.633404 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac31f36e-35c4-4f48-a05b-f49855052358-scripts" (OuterVolumeSpecName: "scripts") pod "ac31f36e-35c4-4f48-a05b-f49855052358" (UID: "ac31f36e-35c4-4f48-a05b-f49855052358"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.638830 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac31f36e-35c4-4f48-a05b-f49855052358-kube-api-access-wt497" (OuterVolumeSpecName: "kube-api-access-wt497") pod "ac31f36e-35c4-4f48-a05b-f49855052358" (UID: "ac31f36e-35c4-4f48-a05b-f49855052358"). InnerVolumeSpecName "kube-api-access-wt497". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.645044 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549" (OuterVolumeSpecName: "glance") pod "ac31f36e-35c4-4f48-a05b-f49855052358" (UID: "ac31f36e-35c4-4f48-a05b-f49855052358"). InnerVolumeSpecName "pvc-5bfae319-10bf-453e-8fc6-7da85b46e549". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.659856 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac31f36e-35c4-4f48-a05b-f49855052358-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac31f36e-35c4-4f48-a05b-f49855052358" (UID: "ac31f36e-35c4-4f48-a05b-f49855052358"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.688039 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac31f36e-35c4-4f48-a05b-f49855052358-config-data" (OuterVolumeSpecName: "config-data") pod "ac31f36e-35c4-4f48-a05b-f49855052358" (UID: "ac31f36e-35c4-4f48-a05b-f49855052358"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.732651 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac31f36e-35c4-4f48-a05b-f49855052358-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.733010 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac31f36e-35c4-4f48-a05b-f49855052358-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.733020 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wt497\" (UniqueName: \"kubernetes.io/projected/ac31f36e-35c4-4f48-a05b-f49855052358-kube-api-access-wt497\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.733030 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac31f36e-35c4-4f48-a05b-f49855052358-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 
00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.733067 4781 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\") on node \"crc\" " Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.766000 4781 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.766152 4781 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-5bfae319-10bf-453e-8fc6-7da85b46e549" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549") on node "crc" Feb 27 00:26:55 crc kubenswrapper[4781]: I0227 00:26:55.835228 4781 reconciler_common.go:293] "Volume detached for volume \"pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\") on node \"crc\" DevicePath \"\"" Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.387345 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ac31f36e-35c4-4f48-a05b-f49855052358","Type":"ContainerDied","Data":"084b441c862a06da9a9c0b4c64f7175017dfb66c7fcb6d7f1ae1871790335c15"} Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.387368 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 27 00:26:56 crc kubenswrapper[4781]: E0227 00:26:56.390504 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-bf4zw" podUID="314ca901-3264-4136-b377-daad0075b72c" Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.450285 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.462047 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.471369 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 00:26:56 crc kubenswrapper[4781]: E0227 00:26:56.471875 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac31f36e-35c4-4f48-a05b-f49855052358" containerName="glance-log" Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.471898 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac31f36e-35c4-4f48-a05b-f49855052358" containerName="glance-log" Feb 27 00:26:56 crc kubenswrapper[4781]: E0227 00:26:56.471931 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac31f36e-35c4-4f48-a05b-f49855052358" containerName="glance-httpd" Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.471941 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac31f36e-35c4-4f48-a05b-f49855052358" containerName="glance-httpd" Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.472180 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac31f36e-35c4-4f48-a05b-f49855052358" containerName="glance-log" Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.472214 
4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac31f36e-35c4-4f48-a05b-f49855052358" containerName="glance-httpd" Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.473485 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.476658 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.478188 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.490774 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.652554 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.653141 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.653199 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ef40468-5e47-4e34-a641-bfbe7803d480-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " 
pod="openstack/glance-default-internal-api-0" Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.653253 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.653327 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.653434 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc459\" (UniqueName: \"kubernetes.io/projected/6ef40468-5e47-4e34-a641-bfbe7803d480-kube-api-access-kc459\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.653464 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ef40468-5e47-4e34-a641-bfbe7803d480-logs\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.653514 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.754793 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.754904 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.754959 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.754987 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ef40468-5e47-4e34-a641-bfbe7803d480-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.755016 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.755060 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.755099 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc459\" (UniqueName: \"kubernetes.io/projected/6ef40468-5e47-4e34-a641-bfbe7803d480-kube-api-access-kc459\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.755124 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ef40468-5e47-4e34-a641-bfbe7803d480-logs\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.755610 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ef40468-5e47-4e34-a641-bfbe7803d480-logs\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.757612 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/6ef40468-5e47-4e34-a641-bfbe7803d480-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.759913 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.760049 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.760707 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.761254 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.761291 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a7b96405e17327882846f95b5adf8b290f3f24e0a3e5cf6d272cf20133e6cae4/globalmount\"" pod="openstack/glance-default-internal-api-0" Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.764242 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.838043 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc459\" (UniqueName: \"kubernetes.io/projected/6ef40468-5e47-4e34-a641-bfbe7803d480-kube-api-access-kc459\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:56 crc kubenswrapper[4781]: I0227 00:26:56.859283 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\") pod \"glance-default-internal-api-0\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:26:57 crc kubenswrapper[4781]: I0227 00:26:57.126813 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 27 00:26:57 crc kubenswrapper[4781]: I0227 00:26:57.320565 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac31f36e-35c4-4f48-a05b-f49855052358" path="/var/lib/kubelet/pods/ac31f36e-35c4-4f48-a05b-f49855052358/volumes" Feb 27 00:26:59 crc kubenswrapper[4781]: I0227 00:26:59.095923 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.116:9090/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 27 00:27:04 crc kubenswrapper[4781]: I0227 00:27:04.107749 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.116:9090/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 27 00:27:04 crc kubenswrapper[4781]: I0227 00:27:04.109748 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:05 crc kubenswrapper[4781]: E0227 00:27:05.344939 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Feb 27 00:27:05 crc kubenswrapper[4781]: E0227 00:27:05.346182 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5cdh5ffh577h57dh66ch59h8chffh65ch575h67ch5b9hfbh544h9bh58ch64h696h95h67fh5dh64bh58fh665h64h8bh9h649h54ch565h5bdh66fq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9qwfg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(c607f0bd-ab23-4fc5-8aa7-437be5e6d59d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.484554 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-gc44h" podUID="8e37b0a7-69ac-439e-9c5a-207210fe40c8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: i/o timeout" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.486695 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.495516 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1f85c54b-b800-429a-ba2d-fe22056ac907","Type":"ContainerDied","Data":"2ad75abe5f1e9859dec62d9d7e1f0e4f7552fc881d371d2d01763329d31bdef8"} Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.495564 4781 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="2ad75abe5f1e9859dec62d9d7e1f0e4f7552fc881d371d2d01763329d31bdef8" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.498370 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gc44h" event={"ID":"8e37b0a7-69ac-439e-9c5a-207210fe40c8","Type":"ContainerDied","Data":"e1839c0058f09c92d633d8b44bcde9496faf128970e2a8993b81a296f21aac5b"} Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.498402 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1839c0058f09c92d633d8b44bcde9496faf128970e2a8993b81a296f21aac5b" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.581161 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.596718 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.646434 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jz5b\" (UniqueName: \"kubernetes.io/projected/8e37b0a7-69ac-439e-9c5a-207210fe40c8-kube-api-access-8jz5b\") pod \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") " Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.646508 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-dns-svc\") pod \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") " Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.646639 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-ovsdbserver-nb\") pod \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\" 
(UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") " Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.646682 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-config\") pod \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") " Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.646741 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-ovsdbserver-sb\") pod \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\" (UID: \"8e37b0a7-69ac-439e-9c5a-207210fe40c8\") " Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.681741 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e37b0a7-69ac-439e-9c5a-207210fe40c8-kube-api-access-8jz5b" (OuterVolumeSpecName: "kube-api-access-8jz5b") pod "8e37b0a7-69ac-439e-9c5a-207210fe40c8" (UID: "8e37b0a7-69ac-439e-9c5a-207210fe40c8"). InnerVolumeSpecName "kube-api-access-8jz5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.714596 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8e37b0a7-69ac-439e-9c5a-207210fe40c8" (UID: "8e37b0a7-69ac-439e-9c5a-207210fe40c8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.740170 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-config" (OuterVolumeSpecName: "config") pod "8e37b0a7-69ac-439e-9c5a-207210fe40c8" (UID: "8e37b0a7-69ac-439e-9c5a-207210fe40c8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.741242 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8e37b0a7-69ac-439e-9c5a-207210fe40c8" (UID: "8e37b0a7-69ac-439e-9c5a-207210fe40c8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.748152 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1f85c54b-b800-429a-ba2d-fe22056ac907-prometheus-metric-storage-rulefiles-2\") pod \"1f85c54b-b800-429a-ba2d-fe22056ac907\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.748278 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1f85c54b-b800-429a-ba2d-fe22056ac907-web-config\") pod \"1f85c54b-b800-429a-ba2d-fe22056ac907\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.748322 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1f85c54b-b800-429a-ba2d-fe22056ac907-config-out\") pod \"1f85c54b-b800-429a-ba2d-fe22056ac907\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.748517 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\") pod \"1f85c54b-b800-429a-ba2d-fe22056ac907\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " Feb 27 00:27:05 crc 
kubenswrapper[4781]: I0227 00:27:05.748578 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2945l\" (UniqueName: \"kubernetes.io/projected/1f85c54b-b800-429a-ba2d-fe22056ac907-kube-api-access-2945l\") pod \"1f85c54b-b800-429a-ba2d-fe22056ac907\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.748614 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1f85c54b-b800-429a-ba2d-fe22056ac907-thanos-prometheus-http-client-file\") pod \"1f85c54b-b800-429a-ba2d-fe22056ac907\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.748659 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1f85c54b-b800-429a-ba2d-fe22056ac907-tls-assets\") pod \"1f85c54b-b800-429a-ba2d-fe22056ac907\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.748680 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f85c54b-b800-429a-ba2d-fe22056ac907-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "1f85c54b-b800-429a-ba2d-fe22056ac907" (UID: "1f85c54b-b800-429a-ba2d-fe22056ac907"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.748716 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1f85c54b-b800-429a-ba2d-fe22056ac907-prometheus-metric-storage-rulefiles-0\") pod \"1f85c54b-b800-429a-ba2d-fe22056ac907\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.748766 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f85c54b-b800-429a-ba2d-fe22056ac907-config\") pod \"1f85c54b-b800-429a-ba2d-fe22056ac907\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.748798 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/1f85c54b-b800-429a-ba2d-fe22056ac907-prometheus-metric-storage-rulefiles-1\") pod \"1f85c54b-b800-429a-ba2d-fe22056ac907\" (UID: \"1f85c54b-b800-429a-ba2d-fe22056ac907\") " Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.749283 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.749306 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.749318 4781 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1f85c54b-b800-429a-ba2d-fe22056ac907-prometheus-metric-storage-rulefiles-2\") on node 
\"crc\" DevicePath \"\"" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.749330 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jz5b\" (UniqueName: \"kubernetes.io/projected/8e37b0a7-69ac-439e-9c5a-207210fe40c8-kube-api-access-8jz5b\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.749341 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.749838 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f85c54b-b800-429a-ba2d-fe22056ac907-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "1f85c54b-b800-429a-ba2d-fe22056ac907" (UID: "1f85c54b-b800-429a-ba2d-fe22056ac907"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.749871 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f85c54b-b800-429a-ba2d-fe22056ac907-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "1f85c54b-b800-429a-ba2d-fe22056ac907" (UID: "1f85c54b-b800-429a-ba2d-fe22056ac907"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.753246 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f85c54b-b800-429a-ba2d-fe22056ac907-config-out" (OuterVolumeSpecName: "config-out") pod "1f85c54b-b800-429a-ba2d-fe22056ac907" (UID: "1f85c54b-b800-429a-ba2d-fe22056ac907"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.756862 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f85c54b-b800-429a-ba2d-fe22056ac907-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "1f85c54b-b800-429a-ba2d-fe22056ac907" (UID: "1f85c54b-b800-429a-ba2d-fe22056ac907"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.759081 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f85c54b-b800-429a-ba2d-fe22056ac907-kube-api-access-2945l" (OuterVolumeSpecName: "kube-api-access-2945l") pod "1f85c54b-b800-429a-ba2d-fe22056ac907" (UID: "1f85c54b-b800-429a-ba2d-fe22056ac907"). InnerVolumeSpecName "kube-api-access-2945l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.759864 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f85c54b-b800-429a-ba2d-fe22056ac907-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "1f85c54b-b800-429a-ba2d-fe22056ac907" (UID: "1f85c54b-b800-429a-ba2d-fe22056ac907"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.762235 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f85c54b-b800-429a-ba2d-fe22056ac907-config" (OuterVolumeSpecName: "config") pod "1f85c54b-b800-429a-ba2d-fe22056ac907" (UID: "1f85c54b-b800-429a-ba2d-fe22056ac907"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.778183 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f85c54b-b800-429a-ba2d-fe22056ac907-web-config" (OuterVolumeSpecName: "web-config") pod "1f85c54b-b800-429a-ba2d-fe22056ac907" (UID: "1f85c54b-b800-429a-ba2d-fe22056ac907"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.790351 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "1f85c54b-b800-429a-ba2d-fe22056ac907" (UID: "1f85c54b-b800-429a-ba2d-fe22056ac907"). InnerVolumeSpecName "pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.811334 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8e37b0a7-69ac-439e-9c5a-207210fe40c8" (UID: "8e37b0a7-69ac-439e-9c5a-207210fe40c8"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.850966 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2945l\" (UniqueName: \"kubernetes.io/projected/1f85c54b-b800-429a-ba2d-fe22056ac907-kube-api-access-2945l\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.851021 4781 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1f85c54b-b800-429a-ba2d-fe22056ac907-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.851038 4781 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1f85c54b-b800-429a-ba2d-fe22056ac907-tls-assets\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.851053 4781 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1f85c54b-b800-429a-ba2d-fe22056ac907-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.851068 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f85c54b-b800-429a-ba2d-fe22056ac907-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.851080 4781 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/1f85c54b-b800-429a-ba2d-fe22056ac907-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.851090 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/8e37b0a7-69ac-439e-9c5a-207210fe40c8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.851101 4781 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1f85c54b-b800-429a-ba2d-fe22056ac907-web-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.851112 4781 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1f85c54b-b800-429a-ba2d-fe22056ac907-config-out\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.851156 4781 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\") on node \"crc\" " Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.881034 4781 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.881483 4781 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6") on node "crc" Feb 27 00:27:05 crc kubenswrapper[4781]: I0227 00:27:05.952959 4781 reconciler_common.go:293] "Volume detached for volume \"pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.038835 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.511506 4781 generic.go:334] "Generic (PLEG): container finished" podID="3f43ab5c-f862-468c-92c1-ec7366eb7ed0" containerID="9abe8ef3a48995708f20de72923495db036e6761eb107a6dfc8ea5dccc96bf58" exitCode=0 Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.511668 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.511731 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gc44h" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.511578 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bk54r" event={"ID":"3f43ab5c-f862-468c-92c1-ec7366eb7ed0","Type":"ContainerDied","Data":"9abe8ef3a48995708f20de72923495db036e6761eb107a6dfc8ea5dccc96bf58"} Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.561718 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gc44h"] Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.573369 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gc44h"] Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.585752 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.596183 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.608380 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 27 00:27:06 crc kubenswrapper[4781]: E0227 00:27:06.608795 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e37b0a7-69ac-439e-9c5a-207210fe40c8" containerName="init" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.608822 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e37b0a7-69ac-439e-9c5a-207210fe40c8" containerName="init" Feb 27 00:27:06 crc kubenswrapper[4781]: E0227 00:27:06.608846 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerName="prometheus" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.608853 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerName="prometheus" Feb 27 00:27:06 crc kubenswrapper[4781]: E0227 
00:27:06.608863 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerName="init-config-reloader" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.608870 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerName="init-config-reloader" Feb 27 00:27:06 crc kubenswrapper[4781]: E0227 00:27:06.608877 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerName="thanos-sidecar" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.608883 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerName="thanos-sidecar" Feb 27 00:27:06 crc kubenswrapper[4781]: E0227 00:27:06.608897 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerName="config-reloader" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.608904 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerName="config-reloader" Feb 27 00:27:06 crc kubenswrapper[4781]: E0227 00:27:06.608918 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e37b0a7-69ac-439e-9c5a-207210fe40c8" containerName="dnsmasq-dns" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.608923 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e37b0a7-69ac-439e-9c5a-207210fe40c8" containerName="dnsmasq-dns" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.609082 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerName="config-reloader" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.609092 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e37b0a7-69ac-439e-9c5a-207210fe40c8" containerName="dnsmasq-dns" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 
00:27:06.609110 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerName="prometheus" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.609120 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f85c54b-b800-429a-ba2d-fe22056ac907" containerName="thanos-sidecar" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.610739 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.613855 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.614022 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.614131 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.614253 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.614373 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.614680 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.615159 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-zmzb4" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.615257 4781 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"prometheus-metric-storage-web-config" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.617194 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.621292 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.668902 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll7tb\" (UniqueName: \"kubernetes.io/projected/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-kube-api-access-ll7tb\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.668943 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.668999 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.669269 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.669298 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.669323 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.669348 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.669368 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-config\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 
00:27:06.669385 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.669401 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.669424 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.669568 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.669703 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-prometheus-metric-storage-rulefiles-2\") pod 
\"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.771395 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.771494 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.771515 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.771545 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.771571 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.771590 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-config\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.771609 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.771633 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.771677 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.771709 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.771738 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.771768 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll7tb\" (UniqueName: \"kubernetes.io/projected/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-kube-api-access-ll7tb\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.771789 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.773304 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 
00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.773876 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.776088 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.777499 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.777541 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b26095f48a6799aae7472dc34ad76c7f8559a3fa84033df1f18203d2595242ed/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.777993 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.778518 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.778660 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.780235 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.784045 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.784813 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.790298 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-config\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.790745 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.797643 4781 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-ll7tb\" (UniqueName: \"kubernetes.io/projected/85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f-kube-api-access-ll7tb\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.820448 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f2e07a89-2544-43e3-9732-e2b3ebf3b9f6\") pod \"prometheus-metric-storage-0\" (UID: \"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f\") " pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:06 crc kubenswrapper[4781]: I0227 00:27:06.969034 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:07 crc kubenswrapper[4781]: E0227 00:27:07.271535 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Feb 27 00:27:07 crc kubenswrapper[4781]: E0227 00:27:07.271719 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tvkqc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-9vlp4_openstack(aef65495-ecb2-4396-bb05-a4c5ee48f291): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 00:27:07 crc kubenswrapper[4781]: E0227 00:27:07.272841 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-9vlp4" podUID="aef65495-ecb2-4396-bb05-a4c5ee48f291" Feb 27 00:27:07 crc kubenswrapper[4781]: I0227 00:27:07.320391 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f85c54b-b800-429a-ba2d-fe22056ac907" path="/var/lib/kubelet/pods/1f85c54b-b800-429a-ba2d-fe22056ac907/volumes" Feb 27 00:27:07 crc kubenswrapper[4781]: I0227 00:27:07.321259 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e37b0a7-69ac-439e-9c5a-207210fe40c8" path="/var/lib/kubelet/pods/8e37b0a7-69ac-439e-9c5a-207210fe40c8/volumes" Feb 27 00:27:07 crc kubenswrapper[4781]: E0227 00:27:07.525804 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-9vlp4" podUID="aef65495-ecb2-4396-bb05-a4c5ee48f291" Feb 27 00:27:10 crc kubenswrapper[4781]: I0227 00:27:10.486888 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-gc44h" podUID="8e37b0a7-69ac-439e-9c5a-207210fe40c8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: i/o timeout" Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 
00:27:11.274720 4781 scope.go:117] "RemoveContainer" containerID="4baba8cd9b434fd2eeb1b9fd32bb1afc0b648fbd81e0b34b4701d41d322ffe7e" Feb 27 00:27:11 crc kubenswrapper[4781]: E0227 00:27:11.275492 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4baba8cd9b434fd2eeb1b9fd32bb1afc0b648fbd81e0b34b4701d41d322ffe7e\": container with ID starting with 4baba8cd9b434fd2eeb1b9fd32bb1afc0b648fbd81e0b34b4701d41d322ffe7e not found: ID does not exist" containerID="4baba8cd9b434fd2eeb1b9fd32bb1afc0b648fbd81e0b34b4701d41d322ffe7e" Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.275519 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4baba8cd9b434fd2eeb1b9fd32bb1afc0b648fbd81e0b34b4701d41d322ffe7e"} err="failed to get container status \"4baba8cd9b434fd2eeb1b9fd32bb1afc0b648fbd81e0b34b4701d41d322ffe7e\": rpc error: code = NotFound desc = could not find container \"4baba8cd9b434fd2eeb1b9fd32bb1afc0b648fbd81e0b34b4701d41d322ffe7e\": container with ID starting with 4baba8cd9b434fd2eeb1b9fd32bb1afc0b648fbd81e0b34b4701d41d322ffe7e not found: ID does not exist" Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.275542 4781 scope.go:117] "RemoveContainer" containerID="a64eaa5dc3f9214454b1c2c8f2206da3278a11327bde4d2d975c26b50904d318" Feb 27 00:27:11 crc kubenswrapper[4781]: E0227 00:27:11.275937 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a64eaa5dc3f9214454b1c2c8f2206da3278a11327bde4d2d975c26b50904d318\": container with ID starting with a64eaa5dc3f9214454b1c2c8f2206da3278a11327bde4d2d975c26b50904d318 not found: ID does not exist" containerID="a64eaa5dc3f9214454b1c2c8f2206da3278a11327bde4d2d975c26b50904d318" Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.275956 4781 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a64eaa5dc3f9214454b1c2c8f2206da3278a11327bde4d2d975c26b50904d318"} err="failed to get container status \"a64eaa5dc3f9214454b1c2c8f2206da3278a11327bde4d2d975c26b50904d318\": rpc error: code = NotFound desc = could not find container \"a64eaa5dc3f9214454b1c2c8f2206da3278a11327bde4d2d975c26b50904d318\": container with ID starting with a64eaa5dc3f9214454b1c2c8f2206da3278a11327bde4d2d975c26b50904d318 not found: ID does not exist" Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.275970 4781 scope.go:117] "RemoveContainer" containerID="4baba8cd9b434fd2eeb1b9fd32bb1afc0b648fbd81e0b34b4701d41d322ffe7e" Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.276255 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4baba8cd9b434fd2eeb1b9fd32bb1afc0b648fbd81e0b34b4701d41d322ffe7e"} err="failed to get container status \"4baba8cd9b434fd2eeb1b9fd32bb1afc0b648fbd81e0b34b4701d41d322ffe7e\": rpc error: code = NotFound desc = could not find container \"4baba8cd9b434fd2eeb1b9fd32bb1afc0b648fbd81e0b34b4701d41d322ffe7e\": container with ID starting with 4baba8cd9b434fd2eeb1b9fd32bb1afc0b648fbd81e0b34b4701d41d322ffe7e not found: ID does not exist" Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.276279 4781 scope.go:117] "RemoveContainer" containerID="a64eaa5dc3f9214454b1c2c8f2206da3278a11327bde4d2d975c26b50904d318" Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.276541 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a64eaa5dc3f9214454b1c2c8f2206da3278a11327bde4d2d975c26b50904d318"} err="failed to get container status \"a64eaa5dc3f9214454b1c2c8f2206da3278a11327bde4d2d975c26b50904d318\": rpc error: code = NotFound desc = could not find container \"a64eaa5dc3f9214454b1c2c8f2206da3278a11327bde4d2d975c26b50904d318\": container with ID starting with a64eaa5dc3f9214454b1c2c8f2206da3278a11327bde4d2d975c26b50904d318 not found: ID does not 
exist" Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.276566 4781 scope.go:117] "RemoveContainer" containerID="2a9c0dd16ab1b03b070571c17e79b75425373239b1c45b1c5b15a3b9a4d8f4b1" Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.430255 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-bk54r" Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.456332 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzkc9\" (UniqueName: \"kubernetes.io/projected/3f43ab5c-f862-468c-92c1-ec7366eb7ed0-kube-api-access-lzkc9\") pod \"3f43ab5c-f862-468c-92c1-ec7366eb7ed0\" (UID: \"3f43ab5c-f862-468c-92c1-ec7366eb7ed0\") " Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.456421 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f43ab5c-f862-468c-92c1-ec7366eb7ed0-combined-ca-bundle\") pod \"3f43ab5c-f862-468c-92c1-ec7366eb7ed0\" (UID: \"3f43ab5c-f862-468c-92c1-ec7366eb7ed0\") " Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.456532 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3f43ab5c-f862-468c-92c1-ec7366eb7ed0-config\") pod \"3f43ab5c-f862-468c-92c1-ec7366eb7ed0\" (UID: \"3f43ab5c-f862-468c-92c1-ec7366eb7ed0\") " Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.467451 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f43ab5c-f862-468c-92c1-ec7366eb7ed0-kube-api-access-lzkc9" (OuterVolumeSpecName: "kube-api-access-lzkc9") pod "3f43ab5c-f862-468c-92c1-ec7366eb7ed0" (UID: "3f43ab5c-f862-468c-92c1-ec7366eb7ed0"). InnerVolumeSpecName "kube-api-access-lzkc9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.500173 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f43ab5c-f862-468c-92c1-ec7366eb7ed0-config" (OuterVolumeSpecName: "config") pod "3f43ab5c-f862-468c-92c1-ec7366eb7ed0" (UID: "3f43ab5c-f862-468c-92c1-ec7366eb7ed0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.500813 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f43ab5c-f862-468c-92c1-ec7366eb7ed0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f43ab5c-f862-468c-92c1-ec7366eb7ed0" (UID: "3f43ab5c-f862-468c-92c1-ec7366eb7ed0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.556444 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fe8b7774-c640-416d-82a4-535fee88a47b","Type":"ContainerStarted","Data":"5275ce5209350a6beca9364d6baa1757ba1b2bb302e2e2d5d8f5780ac3a4ca75"} Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.558314 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzkc9\" (UniqueName: \"kubernetes.io/projected/3f43ab5c-f862-468c-92c1-ec7366eb7ed0-kube-api-access-lzkc9\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.558342 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f43ab5c-f862-468c-92c1-ec7366eb7ed0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.558356 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/3f43ab5c-f862-468c-92c1-ec7366eb7ed0-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.560229 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bk54r" event={"ID":"3f43ab5c-f862-468c-92c1-ec7366eb7ed0","Type":"ContainerDied","Data":"f4907a514c133717f2dd463877fdc9d6b4b6535ee45f8865a1b93ba48242fe73"} Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.560261 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4907a514c133717f2dd463877fdc9d6b4b6535ee45f8865a1b93ba48242fe73" Feb 27 00:27:11 crc kubenswrapper[4781]: I0227 00:27:11.560285 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-bk54r" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.203735 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.282206 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-gxj6b"] Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.716012 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-bp4v9"] Feb 27 00:27:12 crc kubenswrapper[4781]: E0227 00:27:12.716715 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f43ab5c-f862-468c-92c1-ec7366eb7ed0" containerName="neutron-db-sync" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.716728 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f43ab5c-f862-468c-92c1-ec7366eb7ed0" containerName="neutron-db-sync" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.716903 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f43ab5c-f862-468c-92c1-ec7366eb7ed0" containerName="neutron-db-sync" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.721145 4781 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.748726 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-bp4v9"] Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.765551 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.799213 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-bp4v9\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") " pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.799297 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-config\") pod \"dnsmasq-dns-55f844cf75-bp4v9\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") " pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.799391 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-bp4v9\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") " pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.799495 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxql5\" (UniqueName: \"kubernetes.io/projected/dd15e642-6664-416f-ac4e-9cddc96e5642-kube-api-access-rxql5\") pod \"dnsmasq-dns-55f844cf75-bp4v9\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") " 
pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.799520 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-bp4v9\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") " pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.799555 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-dns-svc\") pod \"dnsmasq-dns-55f844cf75-bp4v9\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") " pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.808164 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5445c56cbd-fmcjz"] Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.810063 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.821174 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.821493 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.821719 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-d4ppr" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.821858 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.827412 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5445c56cbd-fmcjz"] Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.897901 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.897959 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.901341 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-config\") pod \"dnsmasq-dns-55f844cf75-bp4v9\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") " 
pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.901433 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-bp4v9\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") " pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.901465 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-config\") pod \"neutron-5445c56cbd-fmcjz\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.901497 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-ovndb-tls-certs\") pod \"neutron-5445c56cbd-fmcjz\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.901533 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b22wp\" (UniqueName: \"kubernetes.io/projected/34294cdd-a18f-4453-8d43-c4d1290e3c59-kube-api-access-b22wp\") pod \"neutron-5445c56cbd-fmcjz\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.901566 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-httpd-config\") pod \"neutron-5445c56cbd-fmcjz\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " 
pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.901584 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxql5\" (UniqueName: \"kubernetes.io/projected/dd15e642-6664-416f-ac4e-9cddc96e5642-kube-api-access-rxql5\") pod \"dnsmasq-dns-55f844cf75-bp4v9\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") " pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.901606 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-bp4v9\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") " pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.901657 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-dns-svc\") pod \"dnsmasq-dns-55f844cf75-bp4v9\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") " pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.901687 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-bp4v9\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") " pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.901709 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-combined-ca-bundle\") pod \"neutron-5445c56cbd-fmcjz\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " 
pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.902763 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-bp4v9\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") " pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.902783 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-config\") pod \"dnsmasq-dns-55f844cf75-bp4v9\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") " pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.903310 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-bp4v9\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") " pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.903354 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-bp4v9\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") " pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.908410 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-dns-svc\") pod \"dnsmasq-dns-55f844cf75-bp4v9\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") " pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:12 crc kubenswrapper[4781]: I0227 00:27:12.939196 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxql5\" (UniqueName: \"kubernetes.io/projected/dd15e642-6664-416f-ac4e-9cddc96e5642-kube-api-access-rxql5\") pod \"dnsmasq-dns-55f844cf75-bp4v9\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") " pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:13 crc kubenswrapper[4781]: I0227 00:27:13.002937 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b22wp\" (UniqueName: \"kubernetes.io/projected/34294cdd-a18f-4453-8d43-c4d1290e3c59-kube-api-access-b22wp\") pod \"neutron-5445c56cbd-fmcjz\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:27:13 crc kubenswrapper[4781]: I0227 00:27:13.003025 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-httpd-config\") pod \"neutron-5445c56cbd-fmcjz\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:27:13 crc kubenswrapper[4781]: I0227 00:27:13.003100 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-combined-ca-bundle\") pod \"neutron-5445c56cbd-fmcjz\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:27:13 crc kubenswrapper[4781]: I0227 00:27:13.003180 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-config\") pod \"neutron-5445c56cbd-fmcjz\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:27:13 crc kubenswrapper[4781]: I0227 00:27:13.003212 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-ovndb-tls-certs\") pod \"neutron-5445c56cbd-fmcjz\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:27:13 crc kubenswrapper[4781]: I0227 00:27:13.008434 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-combined-ca-bundle\") pod \"neutron-5445c56cbd-fmcjz\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:27:13 crc kubenswrapper[4781]: I0227 00:27:13.009690 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-httpd-config\") pod \"neutron-5445c56cbd-fmcjz\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:27:13 crc kubenswrapper[4781]: I0227 00:27:13.013337 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-ovndb-tls-certs\") pod \"neutron-5445c56cbd-fmcjz\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:27:13 crc kubenswrapper[4781]: I0227 00:27:13.022410 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-config\") pod \"neutron-5445c56cbd-fmcjz\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:27:13 crc kubenswrapper[4781]: I0227 00:27:13.025256 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b22wp\" (UniqueName: \"kubernetes.io/projected/34294cdd-a18f-4453-8d43-c4d1290e3c59-kube-api-access-b22wp\") pod 
\"neutron-5445c56cbd-fmcjz\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:27:13 crc kubenswrapper[4781]: I0227 00:27:13.065334 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:13 crc kubenswrapper[4781]: I0227 00:27:13.136653 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.696379 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5b48494fc7-447pr"] Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.701075 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.703358 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.703969 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.718097 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5b48494fc7-447pr"] Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.742477 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-httpd-config\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.742522 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-ovndb-tls-certs\") pod 
\"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.742539 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-public-tls-certs\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.742617 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-combined-ca-bundle\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.742668 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4zg7\" (UniqueName: \"kubernetes.io/projected/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-kube-api-access-h4zg7\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.742688 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-config\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.742709 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-internal-tls-certs\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.845526 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-combined-ca-bundle\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.845606 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4zg7\" (UniqueName: \"kubernetes.io/projected/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-kube-api-access-h4zg7\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.845665 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-config\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.845714 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-internal-tls-certs\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.845833 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-httpd-config\") pod 
\"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.845872 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-ovndb-tls-certs\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.845892 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-public-tls-certs\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.852126 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-httpd-config\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.853769 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-combined-ca-bundle\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.854025 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-internal-tls-certs\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " 
pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.855061 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-config\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.860183 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-ovndb-tls-certs\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.863615 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-public-tls-certs\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:14 crc kubenswrapper[4781]: I0227 00:27:14.863989 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4zg7\" (UniqueName: \"kubernetes.io/projected/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-kube-api-access-h4zg7\") pod \"neutron-5b48494fc7-447pr\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:15 crc kubenswrapper[4781]: I0227 00:27:15.025737 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:19 crc kubenswrapper[4781]: I0227 00:27:19.903470 4781 scope.go:117] "RemoveContainer" containerID="af130dc6503472cd229a6073407a944eb9f345fe510e3a5882815f2cc79c8dc8" Feb 27 00:27:20 crc kubenswrapper[4781]: I0227 00:27:20.640415 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6ef40468-5e47-4e34-a641-bfbe7803d480","Type":"ContainerStarted","Data":"7d0ca3340d609e18433fc291df1d484624d9e133542d96a4dff1a09c6cf6905a"} Feb 27 00:27:20 crc kubenswrapper[4781]: I0227 00:27:20.643858 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gxj6b" event={"ID":"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1","Type":"ContainerStarted","Data":"cd1310454f14cbb6fa301146043553cf2eabbe6f919a1570a19e8768d9fd1b5d"} Feb 27 00:27:20 crc kubenswrapper[4781]: I0227 00:27:20.645057 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f","Type":"ContainerStarted","Data":"07e4f346c30153beb2d7f86fae70b693d729de2b22f5a27f4024b9039dd8a05a"} Feb 27 00:27:23 crc kubenswrapper[4781]: E0227 00:27:23.769543 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Feb 27 00:27:23 crc kubenswrapper[4781]: E0227 00:27:23.770115 4781 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Feb 27 00:27:23 crc kubenswrapper[4781]: E0227 00:27:23.770248 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:cloudkitty-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CloudKittyPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:CloudKittyPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:cloudkitty-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:certs,ReadOnly:true,MountPath:/var/lib/openstack/loki-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bwsv7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42406,RunAsNonRoot:*true,Rea
dOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-db-sync-l9w6z_openstack(2274af64-0743-4ede-8fb8-e2ed801638ac): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 00:27:23 crc kubenswrapper[4781]: E0227 00:27:23.771454 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cloudkitty-db-sync-l9w6z" podUID="2274af64-0743-4ede-8fb8-e2ed801638ac" Feb 27 00:27:24 crc kubenswrapper[4781]: I0227 00:27:24.314252 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5445c56cbd-fmcjz"] Feb 27 00:27:24 crc kubenswrapper[4781]: I0227 00:27:24.528272 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5b48494fc7-447pr"] Feb 27 00:27:24 crc kubenswrapper[4781]: I0227 00:27:24.603690 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-bp4v9"] Feb 27 00:27:24 crc kubenswrapper[4781]: W0227 00:27:24.639832 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd15e642_6664_416f_ac4e_9cddc96e5642.slice/crio-9e67430f08589dcb2cfca360edd38ce35b2b7fe28eecbb76ca402ae3e309ab2c WatchSource:0}: Error finding container 9e67430f08589dcb2cfca360edd38ce35b2b7fe28eecbb76ca402ae3e309ab2c: Status 404 returned error can't find the container with id 9e67430f08589dcb2cfca360edd38ce35b2b7fe28eecbb76ca402ae3e309ab2c Feb 27 00:27:24 crc kubenswrapper[4781]: I0227 00:27:24.712910 4781 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5445c56cbd-fmcjz" event={"ID":"34294cdd-a18f-4453-8d43-c4d1290e3c59","Type":"ContainerStarted","Data":"0384541fca62a0c17aeca1e73d81a12f432aa7f744f83cfe8433a7d935539961"} Feb 27 00:27:24 crc kubenswrapper[4781]: I0227 00:27:24.712968 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5445c56cbd-fmcjz" event={"ID":"34294cdd-a18f-4453-8d43-c4d1290e3c59","Type":"ContainerStarted","Data":"35abaddf64ded29044d57543bd49dba6fb7cc622e405ec56e6449b1f79234b7a"} Feb 27 00:27:24 crc kubenswrapper[4781]: I0227 00:27:24.715083 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jqsnp" event={"ID":"a3fa4251-dd48-417b-8002-6df02d3d3dac","Type":"ContainerStarted","Data":"3dc1eb7dbdd6694e7292463c3972ed88e476b4fd179d083eaeff0cf57f961958"} Feb 27 00:27:24 crc kubenswrapper[4781]: I0227 00:27:24.718153 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b48494fc7-447pr" event={"ID":"2138a247-a569-4ed6-91a9-5dde2a0b5fa9","Type":"ContainerStarted","Data":"8cfc8b26590e03ab4b9d1a7221cd85bef307e38eb533c1221abe3eafc0089adc"} Feb 27 00:27:24 crc kubenswrapper[4781]: I0227 00:27:24.721960 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d","Type":"ContainerStarted","Data":"a57ddca737b909e7bdd1e80d02f2cf19f6581e0895c36ed2a03b91c68fe41892"} Feb 27 00:27:24 crc kubenswrapper[4781]: I0227 00:27:24.726837 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gxj6b" event={"ID":"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1","Type":"ContainerStarted","Data":"d4ee7796e64f1964f0ab74414c33a59e4f95e98e4eb4a260e730590563ac50fe"} Feb 27 00:27:24 crc kubenswrapper[4781]: I0227 00:27:24.737924 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-jqsnp" podStartSLOduration=32.429275189 
podStartE2EDuration="46.737903694s" podCreationTimestamp="2026-02-27 00:26:38 +0000 UTC" firstStartedPulling="2026-02-27 00:26:41.17601959 +0000 UTC m=+1270.433559144" lastFinishedPulling="2026-02-27 00:26:55.484648055 +0000 UTC m=+1284.742187649" observedRunningTime="2026-02-27 00:27:24.733838306 +0000 UTC m=+1313.991377860" watchObservedRunningTime="2026-02-27 00:27:24.737903694 +0000 UTC m=+1313.995443248" Feb 27 00:27:24 crc kubenswrapper[4781]: I0227 00:27:24.752887 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-bf4zw" event={"ID":"314ca901-3264-4136-b377-daad0075b72c","Type":"ContainerStarted","Data":"89638f7647330ea3c5230d3d253e70beeda178adf35863cd73f9bfed5a1f6c4c"} Feb 27 00:27:24 crc kubenswrapper[4781]: I0227 00:27:24.753068 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-gxj6b" podStartSLOduration=36.753058564 podStartE2EDuration="36.753058564s" podCreationTimestamp="2026-02-27 00:26:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:27:24.752111899 +0000 UTC m=+1314.009651453" watchObservedRunningTime="2026-02-27 00:27:24.753058564 +0000 UTC m=+1314.010598118" Feb 27 00:27:24 crc kubenswrapper[4781]: I0227 00:27:24.760264 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" event={"ID":"dd15e642-6664-416f-ac4e-9cddc96e5642","Type":"ContainerStarted","Data":"9e67430f08589dcb2cfca360edd38ce35b2b7fe28eecbb76ca402ae3e309ab2c"} Feb 27 00:27:24 crc kubenswrapper[4781]: E0227 00:27:24.761512 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-l9w6z" 
podUID="2274af64-0743-4ede-8fb8-e2ed801638ac" Feb 27 00:27:24 crc kubenswrapper[4781]: I0227 00:27:24.776764 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-bf4zw" podStartSLOduration=3.421811665 podStartE2EDuration="46.77674243s" podCreationTimestamp="2026-02-27 00:26:38 +0000 UTC" firstStartedPulling="2026-02-27 00:26:40.582292875 +0000 UTC m=+1269.839832429" lastFinishedPulling="2026-02-27 00:27:23.93722364 +0000 UTC m=+1313.194763194" observedRunningTime="2026-02-27 00:27:24.772031935 +0000 UTC m=+1314.029571489" watchObservedRunningTime="2026-02-27 00:27:24.77674243 +0000 UTC m=+1314.034282004" Feb 27 00:27:25 crc kubenswrapper[4781]: I0227 00:27:25.772214 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fe8b7774-c640-416d-82a4-535fee88a47b","Type":"ContainerStarted","Data":"6b3c74fbdf3287fb50b684f9cb5119d529b541e265cc750ba76a4e2cbc2b36ce"} Feb 27 00:27:25 crc kubenswrapper[4781]: I0227 00:27:25.773959 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6ef40468-5e47-4e34-a641-bfbe7803d480","Type":"ContainerStarted","Data":"1cfa42ca96106627e8f6683c51b74701010592d0677332d334a174eb6459416d"} Feb 27 00:27:25 crc kubenswrapper[4781]: I0227 00:27:25.776974 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b48494fc7-447pr" event={"ID":"2138a247-a569-4ed6-91a9-5dde2a0b5fa9","Type":"ContainerStarted","Data":"d7c09d305d22e97d0875bde304e390f511aac9300a440daba221eab217d0ec4d"} Feb 27 00:27:25 crc kubenswrapper[4781]: I0227 00:27:25.777018 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b48494fc7-447pr" event={"ID":"2138a247-a569-4ed6-91a9-5dde2a0b5fa9","Type":"ContainerStarted","Data":"994246fa04a777c2f0ceb85d5b3e476072c41f89030472fc48f602b083a3eada"} Feb 27 00:27:25 crc kubenswrapper[4781]: I0227 00:27:25.777056 4781 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5b48494fc7-447pr"
Feb 27 00:27:25 crc kubenswrapper[4781]: I0227 00:27:25.780351 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9vlp4" event={"ID":"aef65495-ecb2-4396-bb05-a4c5ee48f291","Type":"ContainerStarted","Data":"7d9a07674537261cb97d86282370b22b357712af922b31aea2a8cfe67e8a0a4c"}
Feb 27 00:27:25 crc kubenswrapper[4781]: I0227 00:27:25.781911 4781 generic.go:334] "Generic (PLEG): container finished" podID="dd15e642-6664-416f-ac4e-9cddc96e5642" containerID="90954f2997216aabd438b4d76ca15d61a674e3f9cbf71c7c021a82a58f29b4b5" exitCode=0
Feb 27 00:27:25 crc kubenswrapper[4781]: I0227 00:27:25.781962 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" event={"ID":"dd15e642-6664-416f-ac4e-9cddc96e5642","Type":"ContainerDied","Data":"90954f2997216aabd438b4d76ca15d61a674e3f9cbf71c7c021a82a58f29b4b5"}
Feb 27 00:27:25 crc kubenswrapper[4781]: I0227 00:27:25.786229 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5445c56cbd-fmcjz" event={"ID":"34294cdd-a18f-4453-8d43-c4d1290e3c59","Type":"ContainerStarted","Data":"24cfe8100e51fc567495df6c5f9d60a27bc0381d4315fb54a1ac7e37d2a6bf89"}
Feb 27 00:27:25 crc kubenswrapper[4781]: I0227 00:27:25.786607 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5445c56cbd-fmcjz"
Feb 27 00:27:25 crc kubenswrapper[4781]: I0227 00:27:25.806737 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5b48494fc7-447pr" podStartSLOduration=11.806714499 podStartE2EDuration="11.806714499s" podCreationTimestamp="2026-02-27 00:27:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:27:25.803377151 +0000 UTC m=+1315.060916705" watchObservedRunningTime="2026-02-27 00:27:25.806714499 +0000 UTC m=+1315.064254053"
Feb 27 00:27:25 crc kubenswrapper[4781]: I0227 00:27:25.839618 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5445c56cbd-fmcjz" podStartSLOduration=13.839594868 podStartE2EDuration="13.839594868s" podCreationTimestamp="2026-02-27 00:27:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:27:25.824245282 +0000 UTC m=+1315.081784836" watchObservedRunningTime="2026-02-27 00:27:25.839594868 +0000 UTC m=+1315.097134422"
Feb 27 00:27:25 crc kubenswrapper[4781]: I0227 00:27:25.863268 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-9vlp4" podStartSLOduration=4.343900944 podStartE2EDuration="47.863248843s" podCreationTimestamp="2026-02-27 00:26:38 +0000 UTC" firstStartedPulling="2026-02-27 00:26:40.514584526 +0000 UTC m=+1269.772124080" lastFinishedPulling="2026-02-27 00:27:24.033932425 +0000 UTC m=+1313.291471979" observedRunningTime="2026-02-27 00:27:25.854482671 +0000 UTC m=+1315.112022225" watchObservedRunningTime="2026-02-27 00:27:25.863248843 +0000 UTC m=+1315.120788397"
Feb 27 00:27:26 crc kubenswrapper[4781]: I0227 00:27:26.802241 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fe8b7774-c640-416d-82a4-535fee88a47b","Type":"ContainerStarted","Data":"19709bfd53c352e0c185741ee4fec3ea205eb88bb82c12ed0931a9d4525701e0"}
Feb 27 00:27:26 crc kubenswrapper[4781]: I0227 00:27:26.802370 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="fe8b7774-c640-416d-82a4-535fee88a47b" containerName="glance-log" containerID="cri-o://6b3c74fbdf3287fb50b684f9cb5119d529b541e265cc750ba76a4e2cbc2b36ce" gracePeriod=30
Feb 27 00:27:26 crc kubenswrapper[4781]: I0227 00:27:26.802660 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="fe8b7774-c640-416d-82a4-535fee88a47b" containerName="glance-httpd" containerID="cri-o://19709bfd53c352e0c185741ee4fec3ea205eb88bb82c12ed0931a9d4525701e0" gracePeriod=30
Feb 27 00:27:26 crc kubenswrapper[4781]: I0227 00:27:26.810643 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" event={"ID":"dd15e642-6664-416f-ac4e-9cddc96e5642","Type":"ContainerStarted","Data":"17f046305517b35fcbe0f2929bf55b2ebd9a50075d046746605c3998a3b81daf"}
Feb 27 00:27:26 crc kubenswrapper[4781]: I0227 00:27:26.811555 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-bp4v9"
Feb 27 00:27:26 crc kubenswrapper[4781]: I0227 00:27:26.816160 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6ef40468-5e47-4e34-a641-bfbe7803d480","Type":"ContainerStarted","Data":"16a69c06f9cb944df009af4983f6213dea2db781c728ba1df35b1c181223d614"}
Feb 27 00:27:26 crc kubenswrapper[4781]: I0227 00:27:26.839032 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=41.839013122 podStartE2EDuration="41.839013122s" podCreationTimestamp="2026-02-27 00:26:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:27:26.824809736 +0000 UTC m=+1316.082349310" watchObservedRunningTime="2026-02-27 00:27:26.839013122 +0000 UTC m=+1316.096552676"
Feb 27 00:27:26 crc kubenswrapper[4781]: I0227 00:27:26.846101 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" podStartSLOduration=14.846083848 podStartE2EDuration="14.846083848s" podCreationTimestamp="2026-02-27 00:27:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:27:26.844775914 +0000 UTC m=+1316.102315468" watchObservedRunningTime="2026-02-27 00:27:26.846083848 +0000 UTC m=+1316.103623402"
Feb 27 00:27:26 crc kubenswrapper[4781]: I0227 00:27:26.863497 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=30.863478868 podStartE2EDuration="30.863478868s" podCreationTimestamp="2026-02-27 00:26:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:27:26.860276603 +0000 UTC m=+1316.117816157" watchObservedRunningTime="2026-02-27 00:27:26.863478868 +0000 UTC m=+1316.121018422"
Feb 27 00:27:27 crc kubenswrapper[4781]: I0227 00:27:27.127704 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 27 00:27:27 crc kubenswrapper[4781]: I0227 00:27:27.127761 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 27 00:27:27 crc kubenswrapper[4781]: I0227 00:27:27.127776 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 27 00:27:27 crc kubenswrapper[4781]: I0227 00:27:27.127922 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 27 00:27:27 crc kubenswrapper[4781]: I0227 00:27:27.230430 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 27 00:27:27 crc kubenswrapper[4781]: I0227 00:27:27.233090 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 27 00:27:27 crc kubenswrapper[4781]: I0227 00:27:27.852920 4781 generic.go:334] "Generic (PLEG): container finished" podID="fe8b7774-c640-416d-82a4-535fee88a47b" containerID="19709bfd53c352e0c185741ee4fec3ea205eb88bb82c12ed0931a9d4525701e0" exitCode=0
Feb 27 00:27:27 crc kubenswrapper[4781]: I0227 00:27:27.853176 4781 generic.go:334] "Generic (PLEG): container finished" podID="fe8b7774-c640-416d-82a4-535fee88a47b" containerID="6b3c74fbdf3287fb50b684f9cb5119d529b541e265cc750ba76a4e2cbc2b36ce" exitCode=143
Feb 27 00:27:27 crc kubenswrapper[4781]: I0227 00:27:27.853241 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fe8b7774-c640-416d-82a4-535fee88a47b","Type":"ContainerDied","Data":"19709bfd53c352e0c185741ee4fec3ea205eb88bb82c12ed0931a9d4525701e0"}
Feb 27 00:27:27 crc kubenswrapper[4781]: I0227 00:27:27.853268 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fe8b7774-c640-416d-82a4-535fee88a47b","Type":"ContainerDied","Data":"6b3c74fbdf3287fb50b684f9cb5119d529b541e265cc750ba76a4e2cbc2b36ce"}
Feb 27 00:27:27 crc kubenswrapper[4781]: I0227 00:27:27.890820 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f","Type":"ContainerStarted","Data":"69c8105b8323ade72457bd14497f65a1085bea33148e01ee2cbcfb2de3687cdf"}
Feb 27 00:27:28 crc kubenswrapper[4781]: I0227 00:27:28.916131 4781 generic.go:334] "Generic (PLEG): container finished" podID="b0e6d9a1-4cb3-443f-8a81-32d16c4051b1" containerID="d4ee7796e64f1964f0ab74414c33a59e4f95e98e4eb4a260e730590563ac50fe" exitCode=0
Feb 27 00:27:28 crc kubenswrapper[4781]: I0227 00:27:28.917025 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gxj6b" event={"ID":"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1","Type":"ContainerDied","Data":"d4ee7796e64f1964f0ab74414c33a59e4f95e98e4eb4a260e730590563ac50fe"}
Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.177657 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.268800 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"fe8b7774-c640-416d-82a4-535fee88a47b\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") "
Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.268857 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe8b7774-c640-416d-82a4-535fee88a47b-config-data\") pod \"fe8b7774-c640-416d-82a4-535fee88a47b\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") "
Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.268903 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe8b7774-c640-416d-82a4-535fee88a47b-logs\") pod \"fe8b7774-c640-416d-82a4-535fee88a47b\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") "
Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.268970 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe8b7774-c640-416d-82a4-535fee88a47b-scripts\") pod \"fe8b7774-c640-416d-82a4-535fee88a47b\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") "
Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.269040 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe8b7774-c640-416d-82a4-535fee88a47b-combined-ca-bundle\") pod \"fe8b7774-c640-416d-82a4-535fee88a47b\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") "
Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.269114 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fe8b7774-c640-416d-82a4-535fee88a47b-httpd-run\") pod \"fe8b7774-c640-416d-82a4-535fee88a47b\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") "
Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.269154 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vxcl\" (UniqueName: \"kubernetes.io/projected/fe8b7774-c640-416d-82a4-535fee88a47b-kube-api-access-8vxcl\") pod \"fe8b7774-c640-416d-82a4-535fee88a47b\" (UID: \"fe8b7774-c640-416d-82a4-535fee88a47b\") "
Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.270694 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe8b7774-c640-416d-82a4-535fee88a47b-logs" (OuterVolumeSpecName: "logs") pod "fe8b7774-c640-416d-82a4-535fee88a47b" (UID: "fe8b7774-c640-416d-82a4-535fee88a47b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.270965 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe8b7774-c640-416d-82a4-535fee88a47b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fe8b7774-c640-416d-82a4-535fee88a47b" (UID: "fe8b7774-c640-416d-82a4-535fee88a47b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.281013 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe8b7774-c640-416d-82a4-535fee88a47b-kube-api-access-8vxcl" (OuterVolumeSpecName: "kube-api-access-8vxcl") pod "fe8b7774-c640-416d-82a4-535fee88a47b" (UID: "fe8b7774-c640-416d-82a4-535fee88a47b"). InnerVolumeSpecName "kube-api-access-8vxcl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.287936 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe8b7774-c640-416d-82a4-535fee88a47b-scripts" (OuterVolumeSpecName: "scripts") pod "fe8b7774-c640-416d-82a4-535fee88a47b" (UID: "fe8b7774-c640-416d-82a4-535fee88a47b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.288908 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b" (OuterVolumeSpecName: "glance") pod "fe8b7774-c640-416d-82a4-535fee88a47b" (UID: "fe8b7774-c640-416d-82a4-535fee88a47b"). InnerVolumeSpecName "pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.303668 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe8b7774-c640-416d-82a4-535fee88a47b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe8b7774-c640-416d-82a4-535fee88a47b" (UID: "fe8b7774-c640-416d-82a4-535fee88a47b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.320518 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe8b7774-c640-416d-82a4-535fee88a47b-config-data" (OuterVolumeSpecName: "config-data") pod "fe8b7774-c640-416d-82a4-535fee88a47b" (UID: "fe8b7774-c640-416d-82a4-535fee88a47b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.371814 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe8b7774-c640-416d-82a4-535fee88a47b-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.371854 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe8b7774-c640-416d-82a4-535fee88a47b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.371871 4781 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fe8b7774-c640-416d-82a4-535fee88a47b-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.371880 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vxcl\" (UniqueName: \"kubernetes.io/projected/fe8b7774-c640-416d-82a4-535fee88a47b-kube-api-access-8vxcl\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.371910 4781 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") on node \"crc\" "
Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.371924 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe8b7774-c640-416d-82a4-535fee88a47b-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.371933 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe8b7774-c640-416d-82a4-535fee88a47b-logs\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.408650 4781 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.409146 4781 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b") on node "crc"
Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.474528 4781 reconciler_common.go:293] "Volume detached for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.929719 4781 generic.go:334] "Generic (PLEG): container finished" podID="a3fa4251-dd48-417b-8002-6df02d3d3dac" containerID="3dc1eb7dbdd6694e7292463c3972ed88e476b4fd179d083eaeff0cf57f961958" exitCode=0
Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.929789 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jqsnp" event={"ID":"a3fa4251-dd48-417b-8002-6df02d3d3dac","Type":"ContainerDied","Data":"3dc1eb7dbdd6694e7292463c3972ed88e476b4fd179d083eaeff0cf57f961958"}
Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.934753 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.934749 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fe8b7774-c640-416d-82a4-535fee88a47b","Type":"ContainerDied","Data":"5275ce5209350a6beca9364d6baa1757ba1b2bb302e2e2d5d8f5780ac3a4ca75"}
Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.934854 4781 scope.go:117] "RemoveContainer" containerID="19709bfd53c352e0c185741ee4fec3ea205eb88bb82c12ed0931a9d4525701e0"
Feb 27 00:27:29 crc kubenswrapper[4781]: I0227 00:27:29.998130 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.016753 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.028930 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 27 00:27:30 crc kubenswrapper[4781]: E0227 00:27:30.029352 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe8b7774-c640-416d-82a4-535fee88a47b" containerName="glance-log"
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.029368 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe8b7774-c640-416d-82a4-535fee88a47b" containerName="glance-log"
Feb 27 00:27:30 crc kubenswrapper[4781]: E0227 00:27:30.029384 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe8b7774-c640-416d-82a4-535fee88a47b" containerName="glance-httpd"
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.029391 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe8b7774-c640-416d-82a4-535fee88a47b" containerName="glance-httpd"
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.029584 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe8b7774-c640-416d-82a4-535fee88a47b" containerName="glance-httpd"
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.029601 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe8b7774-c640-416d-82a4-535fee88a47b" containerName="glance-log"
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.030660 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.033123 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.040272 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.045018 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.185970 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb47b6b2-760a-4899-84f6-fdf1bd62a418-logs\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.186069 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-scripts\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.186113 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.186159 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb47b6b2-760a-4899-84f6-fdf1bd62a418-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.186189 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-config-data\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.186216 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pftsh\" (UniqueName: \"kubernetes.io/projected/cb47b6b2-760a-4899-84f6-fdf1bd62a418-kube-api-access-pftsh\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.186273 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.186362 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.287536 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb47b6b2-760a-4899-84f6-fdf1bd62a418-logs\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.287616 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-scripts\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.287681 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.287710 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb47b6b2-760a-4899-84f6-fdf1bd62a418-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.287730 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-config-data\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.287746 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pftsh\" (UniqueName: \"kubernetes.io/projected/cb47b6b2-760a-4899-84f6-fdf1bd62a418-kube-api-access-pftsh\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.287788 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.287846 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.289379 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb47b6b2-760a-4899-84f6-fdf1bd62a418-logs\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.291601 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.291637 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5d3045414bd1cd74ec61e0394ba262493610c57a87bbc940ef275e8fc1bc2ecf/globalmount\"" pod="openstack/glance-default-external-api-0"
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.292728 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb47b6b2-760a-4899-84f6-fdf1bd62a418-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.296446 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-scripts\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.299140 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.306214 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.306739 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-config-data\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.307307 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pftsh\" (UniqueName: \"kubernetes.io/projected/cb47b6b2-760a-4899-84f6-fdf1bd62a418-kube-api-access-pftsh\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.341337 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"glance-default-external-api-0\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " pod="openstack/glance-default-external-api-0"
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.356346 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.948367 4781 generic.go:334] "Generic (PLEG): container finished" podID="314ca901-3264-4136-b377-daad0075b72c" containerID="89638f7647330ea3c5230d3d253e70beeda178adf35863cd73f9bfed5a1f6c4c" exitCode=0
Feb 27 00:27:30 crc kubenswrapper[4781]: I0227 00:27:30.948405 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-bf4zw" event={"ID":"314ca901-3264-4136-b377-daad0075b72c","Type":"ContainerDied","Data":"89638f7647330ea3c5230d3d253e70beeda178adf35863cd73f9bfed5a1f6c4c"}
Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.333803 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe8b7774-c640-416d-82a4-535fee88a47b" path="/var/lib/kubelet/pods/fe8b7774-c640-416d-82a4-535fee88a47b/volumes"
Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.419918 4781 scope.go:117] "RemoveContainer" containerID="6b3c74fbdf3287fb50b684f9cb5119d529b541e265cc750ba76a4e2cbc2b36ce"
Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.636959 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gxj6b"
Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.642870 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jqsnp"
Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.719052 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3fa4251-dd48-417b-8002-6df02d3d3dac-combined-ca-bundle\") pod \"a3fa4251-dd48-417b-8002-6df02d3d3dac\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") "
Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.719154 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-config-data\") pod \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") "
Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.719182 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-credential-keys\") pod \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") "
Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.719221 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3fa4251-dd48-417b-8002-6df02d3d3dac-scripts\") pod \"a3fa4251-dd48-417b-8002-6df02d3d3dac\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") "
Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.719237 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-scripts\") pod \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") "
Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.719368 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jh2sk\" (UniqueName: \"kubernetes.io/projected/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-kube-api-access-jh2sk\") pod \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") "
Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.719410 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3fa4251-dd48-417b-8002-6df02d3d3dac-config-data\") pod \"a3fa4251-dd48-417b-8002-6df02d3d3dac\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") "
Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.719442 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwdbp\" (UniqueName: \"kubernetes.io/projected/a3fa4251-dd48-417b-8002-6df02d3d3dac-kube-api-access-pwdbp\") pod \"a3fa4251-dd48-417b-8002-6df02d3d3dac\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") "
Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.719472 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3fa4251-dd48-417b-8002-6df02d3d3dac-logs\") pod \"a3fa4251-dd48-417b-8002-6df02d3d3dac\" (UID: \"a3fa4251-dd48-417b-8002-6df02d3d3dac\") "
Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.719500 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-combined-ca-bundle\") pod \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") "
Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.719538 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-fernet-keys\") pod \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\" (UID: \"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1\") "
Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.720518 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3fa4251-dd48-417b-8002-6df02d3d3dac-logs" (OuterVolumeSpecName: "logs") pod "a3fa4251-dd48-417b-8002-6df02d3d3dac" (UID: "a3fa4251-dd48-417b-8002-6df02d3d3dac"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.728143 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3fa4251-dd48-417b-8002-6df02d3d3dac-kube-api-access-pwdbp" (OuterVolumeSpecName: "kube-api-access-pwdbp") pod "a3fa4251-dd48-417b-8002-6df02d3d3dac" (UID: "a3fa4251-dd48-417b-8002-6df02d3d3dac"). InnerVolumeSpecName "kube-api-access-pwdbp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.730811 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-scripts" (OuterVolumeSpecName: "scripts") pod "b0e6d9a1-4cb3-443f-8a81-32d16c4051b1" (UID: "b0e6d9a1-4cb3-443f-8a81-32d16c4051b1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.733019 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b0e6d9a1-4cb3-443f-8a81-32d16c4051b1" (UID: "b0e6d9a1-4cb3-443f-8a81-32d16c4051b1"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.733712 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3fa4251-dd48-417b-8002-6df02d3d3dac-scripts" (OuterVolumeSpecName: "scripts") pod "a3fa4251-dd48-417b-8002-6df02d3d3dac" (UID: "a3fa4251-dd48-417b-8002-6df02d3d3dac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.734598 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-kube-api-access-jh2sk" (OuterVolumeSpecName: "kube-api-access-jh2sk") pod "b0e6d9a1-4cb3-443f-8a81-32d16c4051b1" (UID: "b0e6d9a1-4cb3-443f-8a81-32d16c4051b1"). InnerVolumeSpecName "kube-api-access-jh2sk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.736873 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b0e6d9a1-4cb3-443f-8a81-32d16c4051b1" (UID: "b0e6d9a1-4cb3-443f-8a81-32d16c4051b1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.755916 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3fa4251-dd48-417b-8002-6df02d3d3dac-config-data" (OuterVolumeSpecName: "config-data") pod "a3fa4251-dd48-417b-8002-6df02d3d3dac" (UID: "a3fa4251-dd48-417b-8002-6df02d3d3dac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.756755 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0e6d9a1-4cb3-443f-8a81-32d16c4051b1" (UID: "b0e6d9a1-4cb3-443f-8a81-32d16c4051b1"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.757905 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3fa4251-dd48-417b-8002-6df02d3d3dac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3fa4251-dd48-417b-8002-6df02d3d3dac" (UID: "a3fa4251-dd48-417b-8002-6df02d3d3dac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.759186 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-config-data" (OuterVolumeSpecName: "config-data") pod "b0e6d9a1-4cb3-443f-8a81-32d16c4051b1" (UID: "b0e6d9a1-4cb3-443f-8a81-32d16c4051b1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.821788 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.821845 4781 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.821869 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3fa4251-dd48-417b-8002-6df02d3d3dac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.821890 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 
00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.821907 4781 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.821922 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3fa4251-dd48-417b-8002-6df02d3d3dac-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.821936 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.821950 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jh2sk\" (UniqueName: \"kubernetes.io/projected/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1-kube-api-access-jh2sk\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.821969 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3fa4251-dd48-417b-8002-6df02d3d3dac-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.821986 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwdbp\" (UniqueName: \"kubernetes.io/projected/a3fa4251-dd48-417b-8002-6df02d3d3dac-kube-api-access-pwdbp\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.822000 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3fa4251-dd48-417b-8002-6df02d3d3dac-logs\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.963865 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jqsnp" 
event={"ID":"a3fa4251-dd48-417b-8002-6df02d3d3dac","Type":"ContainerDied","Data":"6da65166fa2a15c764d849696d3e6b0686802ef8180c50248f4b03677850887a"} Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.965530 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6da65166fa2a15c764d849696d3e6b0686802ef8180c50248f4b03677850887a" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.964201 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jqsnp" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.968574 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d","Type":"ContainerStarted","Data":"b26ad8aadc8d9267db46f4b4e8381012905bb17ae767e027260b91762d34d717"} Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.978774 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gxj6b" Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.980035 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gxj6b" event={"ID":"b0e6d9a1-4cb3-443f-8a81-32d16c4051b1","Type":"ContainerDied","Data":"cd1310454f14cbb6fa301146043553cf2eabbe6f919a1570a19e8768d9fd1b5d"} Feb 27 00:27:31 crc kubenswrapper[4781]: I0227 00:27:31.980080 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd1310454f14cbb6fa301146043553cf2eabbe6f919a1570a19e8768d9fd1b5d" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.031396 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 00:27:32 crc kubenswrapper[4781]: W0227 00:27:32.074948 4781 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb47b6b2_760a_4899_84f6_fdf1bd62a418.slice/crio-866ec8dc8dd6eea2cbe5498cddcd7820b1b7a00e1ecb5ebf3196c3d57588106d WatchSource:0}: Error finding container 866ec8dc8dd6eea2cbe5498cddcd7820b1b7a00e1ecb5ebf3196c3d57588106d: Status 404 returned error can't find the container with id 866ec8dc8dd6eea2cbe5498cddcd7820b1b7a00e1ecb5ebf3196c3d57588106d Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.139493 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-76c479bbf8-lkpd7"] Feb 27 00:27:32 crc kubenswrapper[4781]: E0227 00:27:32.140014 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3fa4251-dd48-417b-8002-6df02d3d3dac" containerName="placement-db-sync" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.140036 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3fa4251-dd48-417b-8002-6df02d3d3dac" containerName="placement-db-sync" Feb 27 00:27:32 crc kubenswrapper[4781]: E0227 00:27:32.140068 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0e6d9a1-4cb3-443f-8a81-32d16c4051b1" containerName="keystone-bootstrap" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.140078 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0e6d9a1-4cb3-443f-8a81-32d16c4051b1" containerName="keystone-bootstrap" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.140421 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0e6d9a1-4cb3-443f-8a81-32d16c4051b1" containerName="keystone-bootstrap" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.140472 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3fa4251-dd48-417b-8002-6df02d3d3dac" containerName="placement-db-sync" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.142000 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.150268 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.150610 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.150844 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.151052 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-7kxfw" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.152144 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.153968 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-76c479bbf8-lkpd7"] Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.232147 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33c297e1-af3e-46d6-9738-8e6833deaf02-logs\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.232196 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-config-data\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.232248 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-combined-ca-bundle\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.232432 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnnv4\" (UniqueName: \"kubernetes.io/projected/33c297e1-af3e-46d6-9738-8e6833deaf02-kube-api-access-jnnv4\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.232512 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-public-tls-certs\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.232617 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-internal-tls-certs\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.232723 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-scripts\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.335550 
4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-scripts\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.335704 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33c297e1-af3e-46d6-9738-8e6833deaf02-logs\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.335732 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-config-data\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.335771 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-combined-ca-bundle\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.335819 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnnv4\" (UniqueName: \"kubernetes.io/projected/33c297e1-af3e-46d6-9738-8e6833deaf02-kube-api-access-jnnv4\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.335844 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-public-tls-certs\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.335881 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-internal-tls-certs\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.339761 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33c297e1-af3e-46d6-9738-8e6833deaf02-logs\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.342812 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-scripts\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.344243 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-internal-tls-certs\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.344304 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-config-data\") pod \"placement-76c479bbf8-lkpd7\" (UID: 
\"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.344303 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-public-tls-certs\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.344911 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-combined-ca-bundle\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.345653 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-bf4zw" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.354470 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnnv4\" (UniqueName: \"kubernetes.io/projected/33c297e1-af3e-46d6-9738-8e6833deaf02-kube-api-access-jnnv4\") pod \"placement-76c479bbf8-lkpd7\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.437583 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmp94\" (UniqueName: \"kubernetes.io/projected/314ca901-3264-4136-b377-daad0075b72c-kube-api-access-lmp94\") pod \"314ca901-3264-4136-b377-daad0075b72c\" (UID: \"314ca901-3264-4136-b377-daad0075b72c\") " Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.437884 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/314ca901-3264-4136-b377-daad0075b72c-db-sync-config-data\") pod \"314ca901-3264-4136-b377-daad0075b72c\" (UID: \"314ca901-3264-4136-b377-daad0075b72c\") " Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.437950 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314ca901-3264-4136-b377-daad0075b72c-combined-ca-bundle\") pod \"314ca901-3264-4136-b377-daad0075b72c\" (UID: \"314ca901-3264-4136-b377-daad0075b72c\") " Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.442725 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/314ca901-3264-4136-b377-daad0075b72c-kube-api-access-lmp94" (OuterVolumeSpecName: "kube-api-access-lmp94") pod "314ca901-3264-4136-b377-daad0075b72c" (UID: "314ca901-3264-4136-b377-daad0075b72c"). InnerVolumeSpecName "kube-api-access-lmp94". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.444691 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/314ca901-3264-4136-b377-daad0075b72c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "314ca901-3264-4136-b377-daad0075b72c" (UID: "314ca901-3264-4136-b377-daad0075b72c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.460296 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.476802 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/314ca901-3264-4136-b377-daad0075b72c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "314ca901-3264-4136-b377-daad0075b72c" (UID: "314ca901-3264-4136-b377-daad0075b72c"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.540822 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmp94\" (UniqueName: \"kubernetes.io/projected/314ca901-3264-4136-b377-daad0075b72c-kube-api-access-lmp94\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.540850 4781 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/314ca901-3264-4136-b377-daad0075b72c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.540859 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314ca901-3264-4136-b377-daad0075b72c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.887300 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-56459cf68c-4q7c8"] Feb 27 00:27:32 crc kubenswrapper[4781]: E0227 00:27:32.888253 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="314ca901-3264-4136-b377-daad0075b72c" containerName="barbican-db-sync" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.888293 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="314ca901-3264-4136-b377-daad0075b72c" containerName="barbican-db-sync" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.888602 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="314ca901-3264-4136-b377-daad0075b72c" containerName="barbican-db-sync" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.889987 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.896028 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.896278 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.897025 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4nhgp" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.897181 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.897372 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.897608 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-56459cf68c-4q7c8"] Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.901051 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.951340 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-public-tls-certs\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.951386 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-fernet-keys\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " 
pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.951442 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-combined-ca-bundle\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.951479 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx5b7\" (UniqueName: \"kubernetes.io/projected/2467458a-476f-460f-a6ce-144d7304476d-kube-api-access-sx5b7\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.951506 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-scripts\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.951525 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-config-data\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.951539 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-internal-tls-certs\") pod \"keystone-56459cf68c-4q7c8\" (UID: 
\"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.951580 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-credential-keys\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:32 crc kubenswrapper[4781]: I0227 00:27:32.976205 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-76c479bbf8-lkpd7"] Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.007229 4781 generic.go:334] "Generic (PLEG): container finished" podID="aef65495-ecb2-4396-bb05-a4c5ee48f291" containerID="7d9a07674537261cb97d86282370b22b357712af922b31aea2a8cfe67e8a0a4c" exitCode=0 Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.007322 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9vlp4" event={"ID":"aef65495-ecb2-4396-bb05-a4c5ee48f291","Type":"ContainerDied","Data":"7d9a07674537261cb97d86282370b22b357712af922b31aea2a8cfe67e8a0a4c"} Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.011883 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cb47b6b2-760a-4899-84f6-fdf1bd62a418","Type":"ContainerStarted","Data":"5483ad8c7ab58752a9371dfb8baad38f002e9c4bc521ec62a86d28db8755aca8"} Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.011933 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cb47b6b2-760a-4899-84f6-fdf1bd62a418","Type":"ContainerStarted","Data":"866ec8dc8dd6eea2cbe5498cddcd7820b1b7a00e1ecb5ebf3196c3d57588106d"} Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.014846 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-bf4zw" 
event={"ID":"314ca901-3264-4136-b377-daad0075b72c","Type":"ContainerDied","Data":"0dfa44d37d2f64ae96d38dcaa27616ed0a623f908ad45d0066876fbf98be36ee"} Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.014864 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0dfa44d37d2f64ae96d38dcaa27616ed0a623f908ad45d0066876fbf98be36ee" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.014892 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-bf4zw" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.053321 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-public-tls-certs\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.053365 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-fernet-keys\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.053419 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-combined-ca-bundle\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.053457 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx5b7\" (UniqueName: \"kubernetes.io/projected/2467458a-476f-460f-a6ce-144d7304476d-kube-api-access-sx5b7\") pod 
\"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.053503 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-scripts\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.053525 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-config-data\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.053540 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-internal-tls-certs\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.053576 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-credential-keys\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.057258 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-credential-keys\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " 
pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.057848 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-scripts\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.057885 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-fernet-keys\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.058656 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-config-data\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.059453 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-public-tls-certs\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.059985 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-internal-tls-certs\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.061983 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2467458a-476f-460f-a6ce-144d7304476d-combined-ca-bundle\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.066811 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.075354 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx5b7\" (UniqueName: \"kubernetes.io/projected/2467458a-476f-460f-a6ce-144d7304476d-kube-api-access-sx5b7\") pod \"keystone-56459cf68c-4q7c8\" (UID: \"2467458a-476f-460f-a6ce-144d7304476d\") " pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.140415 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-5d6jk"] Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.141008 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" podUID="555d083f-48ec-4cf2-922f-211c99af51be" containerName="dnsmasq-dns" containerID="cri-o://4167de23c8e92b3c4e7bea2c7dcffa9a6588134dc57cd8822b8e44fa6c1099d2" gracePeriod=10 Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.236434 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.430708 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6fff4854c8-ttzsm"] Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.433581 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6fff4854c8-ttzsm"] Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.433608 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7dd7c6f4ff-m4d2l"] Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.434343 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.438698 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7dd7c6f4ff-m4d2l"] Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.438779 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.440287 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-2j295" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.440431 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.440551 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.444851 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-m576l"] Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.446505 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.452750 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.456638 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-m576l"] Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.572054 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj629\" (UniqueName: \"kubernetes.io/projected/f92df023-2e4a-495e-bbef-4a043c661f46-kube-api-access-qj629\") pod \"barbican-worker-7dd7c6f4ff-m4d2l\" (UID: \"f92df023-2e4a-495e-bbef-4a043c661f46\") " pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.572404 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41039943-96a7-4fe6-8b66-0d64cd12a1fa-combined-ca-bundle\") pod \"barbican-keystone-listener-6fff4854c8-ttzsm\" (UID: \"41039943-96a7-4fe6-8b66-0d64cd12a1fa\") " pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.572437 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f92df023-2e4a-495e-bbef-4a043c661f46-config-data-custom\") pod \"barbican-worker-7dd7c6f4ff-m4d2l\" (UID: \"f92df023-2e4a-495e-bbef-4a043c661f46\") " pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.572478 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-config\") pod \"dnsmasq-dns-85ff748b95-m576l\" 
(UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.572498 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-m576l\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.572516 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f92df023-2e4a-495e-bbef-4a043c661f46-config-data\") pod \"barbican-worker-7dd7c6f4ff-m4d2l\" (UID: \"f92df023-2e4a-495e-bbef-4a043c661f46\") " pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.572575 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-dns-svc\") pod \"dnsmasq-dns-85ff748b95-m576l\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.572600 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/41039943-96a7-4fe6-8b66-0d64cd12a1fa-config-data-custom\") pod \"barbican-keystone-listener-6fff4854c8-ttzsm\" (UID: \"41039943-96a7-4fe6-8b66-0d64cd12a1fa\") " pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.572648 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/41039943-96a7-4fe6-8b66-0d64cd12a1fa-logs\") pod \"barbican-keystone-listener-6fff4854c8-ttzsm\" (UID: \"41039943-96a7-4fe6-8b66-0d64cd12a1fa\") " pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.572667 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-m576l\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.572693 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-m576l\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.572737 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41039943-96a7-4fe6-8b66-0d64cd12a1fa-config-data\") pod \"barbican-keystone-listener-6fff4854c8-ttzsm\" (UID: \"41039943-96a7-4fe6-8b66-0d64cd12a1fa\") " pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.572769 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tldtl\" (UniqueName: \"kubernetes.io/projected/41039943-96a7-4fe6-8b66-0d64cd12a1fa-kube-api-access-tldtl\") pod \"barbican-keystone-listener-6fff4854c8-ttzsm\" (UID: \"41039943-96a7-4fe6-8b66-0d64cd12a1fa\") " pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.572843 4781 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f92df023-2e4a-495e-bbef-4a043c661f46-logs\") pod \"barbican-worker-7dd7c6f4ff-m4d2l\" (UID: \"f92df023-2e4a-495e-bbef-4a043c661f46\") " pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.572893 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twcwm\" (UniqueName: \"kubernetes.io/projected/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-kube-api-access-twcwm\") pod \"dnsmasq-dns-85ff748b95-m576l\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.572972 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f92df023-2e4a-495e-bbef-4a043c661f46-combined-ca-bundle\") pod \"barbican-worker-7dd7c6f4ff-m4d2l\" (UID: \"f92df023-2e4a-495e-bbef-4a043c661f46\") " pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.588807 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5fbbfd856b-vgvjg"] Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.591136 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.593857 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.601285 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5fbbfd856b-vgvjg"] Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.675873 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-dns-svc\") pod \"dnsmasq-dns-85ff748b95-m576l\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.676207 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/41039943-96a7-4fe6-8b66-0d64cd12a1fa-config-data-custom\") pod \"barbican-keystone-listener-6fff4854c8-ttzsm\" (UID: \"41039943-96a7-4fe6-8b66-0d64cd12a1fa\") " pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.676238 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-m576l\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.676254 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41039943-96a7-4fe6-8b66-0d64cd12a1fa-logs\") pod \"barbican-keystone-listener-6fff4854c8-ttzsm\" (UID: \"41039943-96a7-4fe6-8b66-0d64cd12a1fa\") " pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" Feb 27 
00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.676279 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41039943-96a7-4fe6-8b66-0d64cd12a1fa-config-data\") pod \"barbican-keystone-listener-6fff4854c8-ttzsm\" (UID: \"41039943-96a7-4fe6-8b66-0d64cd12a1fa\") " pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.676295 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-m576l\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.676323 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tldtl\" (UniqueName: \"kubernetes.io/projected/41039943-96a7-4fe6-8b66-0d64cd12a1fa-kube-api-access-tldtl\") pod \"barbican-keystone-listener-6fff4854c8-ttzsm\" (UID: \"41039943-96a7-4fe6-8b66-0d64cd12a1fa\") " pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.676397 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f92df023-2e4a-495e-bbef-4a043c661f46-logs\") pod \"barbican-worker-7dd7c6f4ff-m4d2l\" (UID: \"f92df023-2e4a-495e-bbef-4a043c661f46\") " pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.676424 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twcwm\" (UniqueName: \"kubernetes.io/projected/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-kube-api-access-twcwm\") pod \"dnsmasq-dns-85ff748b95-m576l\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " 
pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.676457 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f92df023-2e4a-495e-bbef-4a043c661f46-combined-ca-bundle\") pod \"barbican-worker-7dd7c6f4ff-m4d2l\" (UID: \"f92df023-2e4a-495e-bbef-4a043c661f46\") " pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.676474 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj629\" (UniqueName: \"kubernetes.io/projected/f92df023-2e4a-495e-bbef-4a043c661f46-kube-api-access-qj629\") pod \"barbican-worker-7dd7c6f4ff-m4d2l\" (UID: \"f92df023-2e4a-495e-bbef-4a043c661f46\") " pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.676831 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41039943-96a7-4fe6-8b66-0d64cd12a1fa-combined-ca-bundle\") pod \"barbican-keystone-listener-6fff4854c8-ttzsm\" (UID: \"41039943-96a7-4fe6-8b66-0d64cd12a1fa\") " pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.676877 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-dns-svc\") pod \"dnsmasq-dns-85ff748b95-m576l\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.676898 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f92df023-2e4a-495e-bbef-4a043c661f46-config-data-custom\") pod \"barbican-worker-7dd7c6f4ff-m4d2l\" (UID: \"f92df023-2e4a-495e-bbef-4a043c661f46\") 
" pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.676916 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-config\") pod \"dnsmasq-dns-85ff748b95-m576l\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.676938 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-m576l\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.677006 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f92df023-2e4a-495e-bbef-4a043c661f46-config-data\") pod \"barbican-worker-7dd7c6f4ff-m4d2l\" (UID: \"f92df023-2e4a-495e-bbef-4a043c661f46\") " pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.680491 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f92df023-2e4a-495e-bbef-4a043c661f46-logs\") pod \"barbican-worker-7dd7c6f4ff-m4d2l\" (UID: \"f92df023-2e4a-495e-bbef-4a043c661f46\") " pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.680841 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-m576l\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc 
kubenswrapper[4781]: I0227 00:27:33.681468 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-m576l\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.681804 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41039943-96a7-4fe6-8b66-0d64cd12a1fa-logs\") pod \"barbican-keystone-listener-6fff4854c8-ttzsm\" (UID: \"41039943-96a7-4fe6-8b66-0d64cd12a1fa\") " pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.682436 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-config\") pod \"dnsmasq-dns-85ff748b95-m576l\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.683208 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-m576l\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.687387 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41039943-96a7-4fe6-8b66-0d64cd12a1fa-combined-ca-bundle\") pod \"barbican-keystone-listener-6fff4854c8-ttzsm\" (UID: \"41039943-96a7-4fe6-8b66-0d64cd12a1fa\") " pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.688193 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41039943-96a7-4fe6-8b66-0d64cd12a1fa-config-data\") pod \"barbican-keystone-listener-6fff4854c8-ttzsm\" (UID: \"41039943-96a7-4fe6-8b66-0d64cd12a1fa\") " pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.689188 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f92df023-2e4a-495e-bbef-4a043c661f46-config-data\") pod \"barbican-worker-7dd7c6f4ff-m4d2l\" (UID: \"f92df023-2e4a-495e-bbef-4a043c661f46\") " pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.689277 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f92df023-2e4a-495e-bbef-4a043c661f46-combined-ca-bundle\") pod \"barbican-worker-7dd7c6f4ff-m4d2l\" (UID: \"f92df023-2e4a-495e-bbef-4a043c661f46\") " pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.690771 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/41039943-96a7-4fe6-8b66-0d64cd12a1fa-config-data-custom\") pod \"barbican-keystone-listener-6fff4854c8-ttzsm\" (UID: \"41039943-96a7-4fe6-8b66-0d64cd12a1fa\") " pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.695840 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f92df023-2e4a-495e-bbef-4a043c661f46-config-data-custom\") pod \"barbican-worker-7dd7c6f4ff-m4d2l\" (UID: \"f92df023-2e4a-495e-bbef-4a043c661f46\") " pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.697688 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tldtl\" (UniqueName: \"kubernetes.io/projected/41039943-96a7-4fe6-8b66-0d64cd12a1fa-kube-api-access-tldtl\") pod \"barbican-keystone-listener-6fff4854c8-ttzsm\" (UID: \"41039943-96a7-4fe6-8b66-0d64cd12a1fa\") " pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.704216 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj629\" (UniqueName: \"kubernetes.io/projected/f92df023-2e4a-495e-bbef-4a043c661f46-kube-api-access-qj629\") pod \"barbican-worker-7dd7c6f4ff-m4d2l\" (UID: \"f92df023-2e4a-495e-bbef-4a043c661f46\") " pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.704393 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twcwm\" (UniqueName: \"kubernetes.io/projected/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-kube-api-access-twcwm\") pod \"dnsmasq-dns-85ff748b95-m576l\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.779206 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49f24c54-4f24-4f97-a01a-04640bf67b0f-config-data-custom\") pod \"barbican-api-5fbbfd856b-vgvjg\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.779277 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49f24c54-4f24-4f97-a01a-04640bf67b0f-logs\") pod \"barbican-api-5fbbfd856b-vgvjg\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:33 crc 
kubenswrapper[4781]: I0227 00:27:33.779404 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49f24c54-4f24-4f97-a01a-04640bf67b0f-combined-ca-bundle\") pod \"barbican-api-5fbbfd856b-vgvjg\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.779431 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d79w\" (UniqueName: \"kubernetes.io/projected/49f24c54-4f24-4f97-a01a-04640bf67b0f-kube-api-access-5d79w\") pod \"barbican-api-5fbbfd856b-vgvjg\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.779511 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49f24c54-4f24-4f97-a01a-04640bf67b0f-config-data\") pod \"barbican-api-5fbbfd856b-vgvjg\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.812600 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.853092 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.860717 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.892314 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49f24c54-4f24-4f97-a01a-04640bf67b0f-config-data\") pod \"barbican-api-5fbbfd856b-vgvjg\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.892433 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49f24c54-4f24-4f97-a01a-04640bf67b0f-config-data-custom\") pod \"barbican-api-5fbbfd856b-vgvjg\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.892489 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49f24c54-4f24-4f97-a01a-04640bf67b0f-logs\") pod \"barbican-api-5fbbfd856b-vgvjg\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.892763 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49f24c54-4f24-4f97-a01a-04640bf67b0f-combined-ca-bundle\") pod \"barbican-api-5fbbfd856b-vgvjg\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.892821 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d79w\" (UniqueName: \"kubernetes.io/projected/49f24c54-4f24-4f97-a01a-04640bf67b0f-kube-api-access-5d79w\") pod \"barbican-api-5fbbfd856b-vgvjg\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " 
pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.894976 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49f24c54-4f24-4f97-a01a-04640bf67b0f-logs\") pod \"barbican-api-5fbbfd856b-vgvjg\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.898228 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49f24c54-4f24-4f97-a01a-04640bf67b0f-config-data\") pod \"barbican-api-5fbbfd856b-vgvjg\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.903048 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49f24c54-4f24-4f97-a01a-04640bf67b0f-config-data-custom\") pod \"barbican-api-5fbbfd856b-vgvjg\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.920027 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49f24c54-4f24-4f97-a01a-04640bf67b0f-combined-ca-bundle\") pod \"barbican-api-5fbbfd856b-vgvjg\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.924280 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d79w\" (UniqueName: \"kubernetes.io/projected/49f24c54-4f24-4f97-a01a-04640bf67b0f-kube-api-access-5d79w\") pod \"barbican-api-5fbbfd856b-vgvjg\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:33 crc kubenswrapper[4781]: 
I0227 00:27:33.927190 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:33 crc kubenswrapper[4781]: I0227 00:27:33.971959 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.010486 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-dns-swift-storage-0\") pod \"555d083f-48ec-4cf2-922f-211c99af51be\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.010573 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-ovsdbserver-nb\") pod \"555d083f-48ec-4cf2-922f-211c99af51be\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.010656 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-dns-svc\") pod \"555d083f-48ec-4cf2-922f-211c99af51be\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.010734 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-config\") pod \"555d083f-48ec-4cf2-922f-211c99af51be\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.010843 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxl4f\" (UniqueName: 
\"kubernetes.io/projected/555d083f-48ec-4cf2-922f-211c99af51be-kube-api-access-hxl4f\") pod \"555d083f-48ec-4cf2-922f-211c99af51be\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.011038 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-ovsdbserver-sb\") pod \"555d083f-48ec-4cf2-922f-211c99af51be\" (UID: \"555d083f-48ec-4cf2-922f-211c99af51be\") " Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.037079 4781 generic.go:334] "Generic (PLEG): container finished" podID="85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f" containerID="69c8105b8323ade72457bd14497f65a1085bea33148e01ee2cbcfb2de3687cdf" exitCode=0 Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.037174 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f","Type":"ContainerDied","Data":"69c8105b8323ade72457bd14497f65a1085bea33148e01ee2cbcfb2de3687cdf"} Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.057436 4781 generic.go:334] "Generic (PLEG): container finished" podID="555d083f-48ec-4cf2-922f-211c99af51be" containerID="4167de23c8e92b3c4e7bea2c7dcffa9a6588134dc57cd8822b8e44fa6c1099d2" exitCode=0 Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.057619 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.057819 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" event={"ID":"555d083f-48ec-4cf2-922f-211c99af51be","Type":"ContainerDied","Data":"4167de23c8e92b3c4e7bea2c7dcffa9a6588134dc57cd8822b8e44fa6c1099d2"} Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.057897 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-5d6jk" event={"ID":"555d083f-48ec-4cf2-922f-211c99af51be","Type":"ContainerDied","Data":"e585d85b515ebdb2e3ddbc1e6c665f7d63c4b4ae71b72a576899c9774906e6b5"} Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.057916 4781 scope.go:117] "RemoveContainer" containerID="4167de23c8e92b3c4e7bea2c7dcffa9a6588134dc57cd8822b8e44fa6c1099d2" Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.079018 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/555d083f-48ec-4cf2-922f-211c99af51be-kube-api-access-hxl4f" (OuterVolumeSpecName: "kube-api-access-hxl4f") pod "555d083f-48ec-4cf2-922f-211c99af51be" (UID: "555d083f-48ec-4cf2-922f-211c99af51be"). InnerVolumeSpecName "kube-api-access-hxl4f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.125388 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxl4f\" (UniqueName: \"kubernetes.io/projected/555d083f-48ec-4cf2-922f-211c99af51be-kube-api-access-hxl4f\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.144570 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cb47b6b2-760a-4899-84f6-fdf1bd62a418","Type":"ContainerStarted","Data":"fd234a650b390b48c2c62ec04eb6c4e5afa5d6f4b0db395429958fa19cde51f2"} Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.156225 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76c479bbf8-lkpd7" event={"ID":"33c297e1-af3e-46d6-9738-8e6833deaf02","Type":"ContainerStarted","Data":"412d907729275d47b2b33bfdb313eef0737f0bef5101c541d8006af94fa96bc8"} Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.156265 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76c479bbf8-lkpd7" event={"ID":"33c297e1-af3e-46d6-9738-8e6833deaf02","Type":"ContainerStarted","Data":"21b15cb407945a01adc26829ab99f15cd9c656e66d81cf610b3118b8b9526261"} Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.164215 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-56459cf68c-4q7c8"] Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.165834 4781 scope.go:117] "RemoveContainer" containerID="bba0de3b0a693253442f20a5772ff5d51a195d6366bc0feba6d0fb4dc5446a15" Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.204185 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.204164589 podStartE2EDuration="5.204164589s" podCreationTimestamp="2026-02-27 00:27:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:27:34.202671439 +0000 UTC m=+1323.460210993" watchObservedRunningTime="2026-02-27 00:27:34.204164589 +0000 UTC m=+1323.461704143" Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.283957 4781 scope.go:117] "RemoveContainer" containerID="4167de23c8e92b3c4e7bea2c7dcffa9a6588134dc57cd8822b8e44fa6c1099d2" Feb 27 00:27:34 crc kubenswrapper[4781]: E0227 00:27:34.286586 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4167de23c8e92b3c4e7bea2c7dcffa9a6588134dc57cd8822b8e44fa6c1099d2\": container with ID starting with 4167de23c8e92b3c4e7bea2c7dcffa9a6588134dc57cd8822b8e44fa6c1099d2 not found: ID does not exist" containerID="4167de23c8e92b3c4e7bea2c7dcffa9a6588134dc57cd8822b8e44fa6c1099d2" Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.286695 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4167de23c8e92b3c4e7bea2c7dcffa9a6588134dc57cd8822b8e44fa6c1099d2"} err="failed to get container status \"4167de23c8e92b3c4e7bea2c7dcffa9a6588134dc57cd8822b8e44fa6c1099d2\": rpc error: code = NotFound desc = could not find container \"4167de23c8e92b3c4e7bea2c7dcffa9a6588134dc57cd8822b8e44fa6c1099d2\": container with ID starting with 4167de23c8e92b3c4e7bea2c7dcffa9a6588134dc57cd8822b8e44fa6c1099d2 not found: ID does not exist" Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.286721 4781 scope.go:117] "RemoveContainer" containerID="bba0de3b0a693253442f20a5772ff5d51a195d6366bc0feba6d0fb4dc5446a15" Feb 27 00:27:34 crc kubenswrapper[4781]: E0227 00:27:34.287876 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bba0de3b0a693253442f20a5772ff5d51a195d6366bc0feba6d0fb4dc5446a15\": container with ID starting with bba0de3b0a693253442f20a5772ff5d51a195d6366bc0feba6d0fb4dc5446a15 not found: 
ID does not exist" containerID="bba0de3b0a693253442f20a5772ff5d51a195d6366bc0feba6d0fb4dc5446a15" Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.287909 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bba0de3b0a693253442f20a5772ff5d51a195d6366bc0feba6d0fb4dc5446a15"} err="failed to get container status \"bba0de3b0a693253442f20a5772ff5d51a195d6366bc0feba6d0fb4dc5446a15\": rpc error: code = NotFound desc = could not find container \"bba0de3b0a693253442f20a5772ff5d51a195d6366bc0feba6d0fb4dc5446a15\": container with ID starting with bba0de3b0a693253442f20a5772ff5d51a195d6366bc0feba6d0fb4dc5446a15 not found: ID does not exist" Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.445463 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "555d083f-48ec-4cf2-922f-211c99af51be" (UID: "555d083f-48ec-4cf2-922f-211c99af51be"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.494069 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "555d083f-48ec-4cf2-922f-211c99af51be" (UID: "555d083f-48ec-4cf2-922f-211c99af51be"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.494752 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-config" (OuterVolumeSpecName: "config") pod "555d083f-48ec-4cf2-922f-211c99af51be" (UID: "555d083f-48ec-4cf2-922f-211c99af51be"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.509589 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "555d083f-48ec-4cf2-922f-211c99af51be" (UID: "555d083f-48ec-4cf2-922f-211c99af51be"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.509781 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "555d083f-48ec-4cf2-922f-211c99af51be" (UID: "555d083f-48ec-4cf2-922f-211c99af51be"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.536151 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.536452 4781 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.536462 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.536471 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 
00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.536482 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/555d083f-48ec-4cf2-922f-211c99af51be-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.729702 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-5d6jk"] Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.746602 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-5d6jk"] Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.779001 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6fff4854c8-ttzsm"] Feb 27 00:27:34 crc kubenswrapper[4781]: W0227 00:27:34.788117 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41039943_96a7_4fe6_8b66_0d64cd12a1fa.slice/crio-8435f854a42c84590af86fd91c77e61b452c0ad591c3cb3add91ab407c060fca WatchSource:0}: Error finding container 8435f854a42c84590af86fd91c77e61b452c0ad591c3cb3add91ab407c060fca: Status 404 returned error can't find the container with id 8435f854a42c84590af86fd91c77e61b452c0ad591c3cb3add91ab407c060fca Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.798007 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7dd7c6f4ff-m4d2l"] Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.890284 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.946980 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-combined-ca-bundle\") pod \"aef65495-ecb2-4396-bb05-a4c5ee48f291\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.947046 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-config-data\") pod \"aef65495-ecb2-4396-bb05-a4c5ee48f291\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.947069 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-scripts\") pod \"aef65495-ecb2-4396-bb05-a4c5ee48f291\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.947174 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aef65495-ecb2-4396-bb05-a4c5ee48f291-etc-machine-id\") pod \"aef65495-ecb2-4396-bb05-a4c5ee48f291\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.950049 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-db-sync-config-data\") pod \"aef65495-ecb2-4396-bb05-a4c5ee48f291\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.950148 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvkqc\" 
(UniqueName: \"kubernetes.io/projected/aef65495-ecb2-4396-bb05-a4c5ee48f291-kube-api-access-tvkqc\") pod \"aef65495-ecb2-4396-bb05-a4c5ee48f291\" (UID: \"aef65495-ecb2-4396-bb05-a4c5ee48f291\") " Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.954219 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aef65495-ecb2-4396-bb05-a4c5ee48f291-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "aef65495-ecb2-4396-bb05-a4c5ee48f291" (UID: "aef65495-ecb2-4396-bb05-a4c5ee48f291"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.960501 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "aef65495-ecb2-4396-bb05-a4c5ee48f291" (UID: "aef65495-ecb2-4396-bb05-a4c5ee48f291"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.960713 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-scripts" (OuterVolumeSpecName: "scripts") pod "aef65495-ecb2-4396-bb05-a4c5ee48f291" (UID: "aef65495-ecb2-4396-bb05-a4c5ee48f291"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:34 crc kubenswrapper[4781]: I0227 00:27:34.969955 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aef65495-ecb2-4396-bb05-a4c5ee48f291-kube-api-access-tvkqc" (OuterVolumeSpecName: "kube-api-access-tvkqc") pod "aef65495-ecb2-4396-bb05-a4c5ee48f291" (UID: "aef65495-ecb2-4396-bb05-a4c5ee48f291"). InnerVolumeSpecName "kube-api-access-tvkqc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.037077 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aef65495-ecb2-4396-bb05-a4c5ee48f291" (UID: "aef65495-ecb2-4396-bb05-a4c5ee48f291"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.045938 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-m576l"] Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.046263 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-config-data" (OuterVolumeSpecName: "config-data") pod "aef65495-ecb2-4396-bb05-a4c5ee48f291" (UID: "aef65495-ecb2-4396-bb05-a4c5ee48f291"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:35 crc kubenswrapper[4781]: W0227 00:27:35.051426 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49f24c54_4f24_4f97_a01a_04640bf67b0f.slice/crio-abf074d9baa2f3d6e8969094139a58da187066e40f9840d7df7ac1542a6fb7f6 WatchSource:0}: Error finding container abf074d9baa2f3d6e8969094139a58da187066e40f9840d7df7ac1542a6fb7f6: Status 404 returned error can't find the container with id abf074d9baa2f3d6e8969094139a58da187066e40f9840d7df7ac1542a6fb7f6 Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.055607 4781 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.055662 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvkqc\" (UniqueName: \"kubernetes.io/projected/aef65495-ecb2-4396-bb05-a4c5ee48f291-kube-api-access-tvkqc\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.055677 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.055690 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.055700 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aef65495-ecb2-4396-bb05-a4c5ee48f291-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.055711 
4781 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aef65495-ecb2-4396-bb05-a4c5ee48f291-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.056239 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5fbbfd856b-vgvjg"] Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.176137 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-m576l" event={"ID":"cda4fb4c-7510-49d2-b7bb-2a61c669bacd","Type":"ContainerStarted","Data":"754e7671d1990c27612d0957bd563a0b4f17011e98b48fda1600802520e76182"} Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.178437 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-56459cf68c-4q7c8" event={"ID":"2467458a-476f-460f-a6ce-144d7304476d","Type":"ContainerStarted","Data":"ff6f7d298294bd4b03eba521b7541e4de877f502be33db785bbf690ed8409bf3"} Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.178483 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-56459cf68c-4q7c8" event={"ID":"2467458a-476f-460f-a6ce-144d7304476d","Type":"ContainerStarted","Data":"024f4985a838887352d7012c1a10f2d744a1369f0eeee5b41465456991d59d79"} Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.178574 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.183870 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" event={"ID":"f92df023-2e4a-495e-bbef-4a043c661f46","Type":"ContainerStarted","Data":"d556255b7fcd41dba0cdb3d082ddcb7ae02f6232b565d8f11f4719ce58109ffc"} Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.201072 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f","Type":"ContainerStarted","Data":"45b721ee40c522c5e8c9429e4acd1b60e74a7dda2c29b7520c9f497aec09c91f"} Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.220180 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-56459cf68c-4q7c8" podStartSLOduration=3.2201613 podStartE2EDuration="3.2201613s" podCreationTimestamp="2026-02-27 00:27:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:27:35.216432092 +0000 UTC m=+1324.473971656" watchObservedRunningTime="2026-02-27 00:27:35.2201613 +0000 UTC m=+1324.477700854" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.229409 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9vlp4" event={"ID":"aef65495-ecb2-4396-bb05-a4c5ee48f291","Type":"ContainerDied","Data":"77049757ad8c9d1e53f2546542f34ddf95b52b836b4034f26af7417bb129d6d8"} Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.229452 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77049757ad8c9d1e53f2546542f34ddf95b52b836b4034f26af7417bb129d6d8" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.229417 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-9vlp4" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.239692 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fbbfd856b-vgvjg" event={"ID":"49f24c54-4f24-4f97-a01a-04640bf67b0f","Type":"ContainerStarted","Data":"abf074d9baa2f3d6e8969094139a58da187066e40f9840d7df7ac1542a6fb7f6"} Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.247376 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" event={"ID":"41039943-96a7-4fe6-8b66-0d64cd12a1fa","Type":"ContainerStarted","Data":"8435f854a42c84590af86fd91c77e61b452c0ad591c3cb3add91ab407c060fca"} Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.252877 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76c479bbf8-lkpd7" event={"ID":"33c297e1-af3e-46d6-9738-8e6833deaf02","Type":"ContainerStarted","Data":"6d2f108203b1ec74c18c2d296290e2f313baef1cea6d95e946e71ef00a537f68"} Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.252934 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.252952 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.262686 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 27 00:27:35 crc kubenswrapper[4781]: E0227 00:27:35.263678 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="555d083f-48ec-4cf2-922f-211c99af51be" containerName="init" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.263703 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="555d083f-48ec-4cf2-922f-211c99af51be" containerName="init" Feb 27 00:27:35 crc kubenswrapper[4781]: E0227 00:27:35.263736 4781 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="555d083f-48ec-4cf2-922f-211c99af51be" containerName="dnsmasq-dns" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.263746 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="555d083f-48ec-4cf2-922f-211c99af51be" containerName="dnsmasq-dns" Feb 27 00:27:35 crc kubenswrapper[4781]: E0227 00:27:35.263761 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aef65495-ecb2-4396-bb05-a4c5ee48f291" containerName="cinder-db-sync" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.263769 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="aef65495-ecb2-4396-bb05-a4c5ee48f291" containerName="cinder-db-sync" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.264024 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="aef65495-ecb2-4396-bb05-a4c5ee48f291" containerName="cinder-db-sync" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.264048 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="555d083f-48ec-4cf2-922f-211c99af51be" containerName="dnsmasq-dns" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.269879 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.278767 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.278883 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.278909 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-5hsdr" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.279017 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.296110 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-76c479bbf8-lkpd7" podStartSLOduration=3.296094016 podStartE2EDuration="3.296094016s" podCreationTimestamp="2026-02-27 00:27:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:27:35.279601701 +0000 UTC m=+1324.537141265" watchObservedRunningTime="2026-02-27 00:27:35.296094016 +0000 UTC m=+1324.553633560" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.296516 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.352516 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="555d083f-48ec-4cf2-922f-211c99af51be" path="/var/lib/kubelet/pods/555d083f-48ec-4cf2-922f-211c99af51be/volumes" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.353670 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-m576l"] Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.398809 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-scripts\") pod \"cinder-scheduler-0\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " pod="openstack/cinder-scheduler-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.398866 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " pod="openstack/cinder-scheduler-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.398900 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " pod="openstack/cinder-scheduler-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.399002 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " pod="openstack/cinder-scheduler-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.399025 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87l2c\" (UniqueName: \"kubernetes.io/projected/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-kube-api-access-87l2c\") pod \"cinder-scheduler-0\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " pod="openstack/cinder-scheduler-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.399125 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-config-data\") pod \"cinder-scheduler-0\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " pod="openstack/cinder-scheduler-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.429960 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-btbp6"] Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.438644 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.467229 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-btbp6"] Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.501125 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-config-data\") pod \"cinder-scheduler-0\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " pod="openstack/cinder-scheduler-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.501182 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-btbp6\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.501201 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-config\") pod \"dnsmasq-dns-5c9776ccc5-btbp6\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.501233 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w96d\" (UniqueName: \"kubernetes.io/projected/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-kube-api-access-4w96d\") pod \"dnsmasq-dns-5c9776ccc5-btbp6\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.501262 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-scripts\") pod \"cinder-scheduler-0\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " pod="openstack/cinder-scheduler-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.501289 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " pod="openstack/cinder-scheduler-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.501315 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " pod="openstack/cinder-scheduler-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.501332 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-btbp6\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.501354 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-btbp6\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.501384 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-btbp6\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.501429 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " pod="openstack/cinder-scheduler-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.501443 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87l2c\" (UniqueName: \"kubernetes.io/projected/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-kube-api-access-87l2c\") pod \"cinder-scheduler-0\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " pod="openstack/cinder-scheduler-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.504250 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " pod="openstack/cinder-scheduler-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.568557 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.571059 4781 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.574925 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.606132 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.606873 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-btbp6\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.606917 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-btbp6\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.606956 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-btbp6\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.607116 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-btbp6\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" Feb 27 00:27:35 crc 
kubenswrapper[4781]: I0227 00:27:35.607136 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-config\") pod \"dnsmasq-dns-5c9776ccc5-btbp6\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.607174 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w96d\" (UniqueName: \"kubernetes.io/projected/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-kube-api-access-4w96d\") pod \"dnsmasq-dns-5c9776ccc5-btbp6\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.688588 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " pod="openstack/cinder-scheduler-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.688621 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " pod="openstack/cinder-scheduler-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.689108 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-config-data\") pod \"cinder-scheduler-0\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " pod="openstack/cinder-scheduler-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.689175 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-scripts\") pod \"cinder-scheduler-0\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " pod="openstack/cinder-scheduler-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.689582 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87l2c\" (UniqueName: \"kubernetes.io/projected/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-kube-api-access-87l2c\") pod \"cinder-scheduler-0\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " pod="openstack/cinder-scheduler-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.689948 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-btbp6\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.690055 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-btbp6\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.690357 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-btbp6\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.690677 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-config\") pod \"dnsmasq-dns-5c9776ccc5-btbp6\" (UID: 
\"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.698034 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-btbp6\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.710748 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-scripts\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.710831 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-logs\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.710861 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-config-data\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.710885 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-config-data-custom\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.711064 
4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5fsl\" (UniqueName: \"kubernetes.io/projected/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-kube-api-access-g5fsl\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.711297 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.711342 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.711993 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w96d\" (UniqueName: \"kubernetes.io/projected/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-kube-api-access-4w96d\") pod \"dnsmasq-dns-5c9776ccc5-btbp6\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.813452 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-logs\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.813508 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-config-data\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.813533 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-config-data-custom\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.813674 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5fsl\" (UniqueName: \"kubernetes.io/projected/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-kube-api-access-g5fsl\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.813757 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.813782 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.813835 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-scripts\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0" Feb 27 00:27:35 crc 
kubenswrapper[4781]: I0227 00:27:35.813955 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.833527 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-logs\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.838420 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.838508 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-config-data-custom\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.838650 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-scripts\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.839037 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5fsl\" (UniqueName: \"kubernetes.io/projected/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-kube-api-access-g5fsl\") pod \"cinder-api-0\" (UID: 
\"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.844598 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-config-data\") pod \"cinder-api-0\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " pod="openstack/cinder-api-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.893085 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.927045 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 27 00:27:35 crc kubenswrapper[4781]: I0227 00:27:35.953506 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 27 00:27:36 crc kubenswrapper[4781]: I0227 00:27:36.294753 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fbbfd856b-vgvjg" event={"ID":"49f24c54-4f24-4f97-a01a-04640bf67b0f","Type":"ContainerStarted","Data":"0f274fbd09031e3b8e38174b2cbe52a6c6f5f24b60283aea8a0e1a01875fd8b1"} Feb 27 00:27:36 crc kubenswrapper[4781]: I0227 00:27:36.295007 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fbbfd856b-vgvjg" event={"ID":"49f24c54-4f24-4f97-a01a-04640bf67b0f","Type":"ContainerStarted","Data":"ee82fae8a491ded998bd0190a2bb94c2ff762013e3316811c1d1f983f5c06787"} Feb 27 00:27:36 crc kubenswrapper[4781]: I0227 00:27:36.298240 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:36 crc kubenswrapper[4781]: I0227 00:27:36.298339 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:36 crc kubenswrapper[4781]: I0227 00:27:36.303577 4781 
generic.go:334] "Generic (PLEG): container finished" podID="cda4fb4c-7510-49d2-b7bb-2a61c669bacd" containerID="855ac7a49dcfb27210a6b4627deec5ef2b8dada97c06c16142807b4ec54a5193" exitCode=0 Feb 27 00:27:36 crc kubenswrapper[4781]: I0227 00:27:36.304656 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-m576l" event={"ID":"cda4fb4c-7510-49d2-b7bb-2a61c669bacd","Type":"ContainerDied","Data":"855ac7a49dcfb27210a6b4627deec5ef2b8dada97c06c16142807b4ec54a5193"} Feb 27 00:27:36 crc kubenswrapper[4781]: I0227 00:27:36.365175 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5fbbfd856b-vgvjg" podStartSLOduration=3.365151739 podStartE2EDuration="3.365151739s" podCreationTimestamp="2026-02-27 00:27:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:27:36.335240468 +0000 UTC m=+1325.592780022" watchObservedRunningTime="2026-02-27 00:27:36.365151739 +0000 UTC m=+1325.622691293" Feb 27 00:27:36 crc kubenswrapper[4781]: I0227 00:27:36.533421 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-btbp6"] Feb 27 00:27:36 crc kubenswrapper[4781]: I0227 00:27:36.643564 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 27 00:27:36 crc kubenswrapper[4781]: I0227 00:27:36.819445 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 27 00:27:38 crc kubenswrapper[4781]: W0227 00:27:38.968424 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9e8c990_7ee9_4f45_91cd_3b49bffbe639.slice/crio-046131775b2694d2a879762121c97dfa31aa0232cc2f2fd28fe9bc50e7ab867a WatchSource:0}: Error finding container 046131775b2694d2a879762121c97dfa31aa0232cc2f2fd28fe9bc50e7ab867a: Status 404 returned error can't find the 
container with id 046131775b2694d2a879762121c97dfa31aa0232cc2f2fd28fe9bc50e7ab867a Feb 27 00:27:39 crc kubenswrapper[4781]: W0227 00:27:39.005592 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4ba4e2f_1bdf_4f98_a4f6_16c12df07d27.slice/crio-7688a82f556cb3de2b0b7afd752b5b83e46b7cf5458a44e7fd021d53d5d1b140 WatchSource:0}: Error finding container 7688a82f556cb3de2b0b7afd752b5b83e46b7cf5458a44e7fd021d53d5d1b140: Status 404 returned error can't find the container with id 7688a82f556cb3de2b0b7afd752b5b83e46b7cf5458a44e7fd021d53d5d1b140 Feb 27 00:27:39 crc kubenswrapper[4781]: W0227 00:27:39.013617 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2a90e98_bb9f_436d_9a1c_8aebd91000e3.slice/crio-f5601a008ad9454c1a7af70c2d0c5712b2a38f8540f6108d4eb74d5c92b8bcd7 WatchSource:0}: Error finding container f5601a008ad9454c1a7af70c2d0c5712b2a38f8540f6108d4eb74d5c92b8bcd7: Status 404 returned error can't find the container with id f5601a008ad9454c1a7af70c2d0c5712b2a38f8540f6108d4eb74d5c92b8bcd7 Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.212061 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.286781 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-config\") pod \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.286868 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-dns-swift-storage-0\") pod \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.286895 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-dns-svc\") pod \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.287000 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twcwm\" (UniqueName: \"kubernetes.io/projected/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-kube-api-access-twcwm\") pod \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.287035 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-ovsdbserver-sb\") pod \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.287176 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-ovsdbserver-nb\") pod \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\" (UID: \"cda4fb4c-7510-49d2-b7bb-2a61c669bacd\") " Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.302662 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-kube-api-access-twcwm" (OuterVolumeSpecName: "kube-api-access-twcwm") pod "cda4fb4c-7510-49d2-b7bb-2a61c669bacd" (UID: "cda4fb4c-7510-49d2-b7bb-2a61c669bacd"). InnerVolumeSpecName "kube-api-access-twcwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.366263 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" event={"ID":"f2a90e98-bb9f-436d-9a1c-8aebd91000e3","Type":"ContainerStarted","Data":"f5601a008ad9454c1a7af70c2d0c5712b2a38f8540f6108d4eb74d5c92b8bcd7"} Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.367814 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27","Type":"ContainerStarted","Data":"7688a82f556cb3de2b0b7afd752b5b83e46b7cf5458a44e7fd021d53d5d1b140"} Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.371004 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c9e8c990-7ee9-4f45-91cd-3b49bffbe639","Type":"ContainerStarted","Data":"046131775b2694d2a879762121c97dfa31aa0232cc2f2fd28fe9bc50e7ab867a"} Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.378129 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-m576l" event={"ID":"cda4fb4c-7510-49d2-b7bb-2a61c669bacd","Type":"ContainerDied","Data":"754e7671d1990c27612d0957bd563a0b4f17011e98b48fda1600802520e76182"} Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.378153 4781 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-m576l" Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.378181 4781 scope.go:117] "RemoveContainer" containerID="855ac7a49dcfb27210a6b4627deec5ef2b8dada97c06c16142807b4ec54a5193" Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.389749 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f","Type":"ContainerStarted","Data":"7c199541b9c842fbd78de05b7b58ee7fd9ba33f171300f536207f3f7cedd9d3e"} Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.391973 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twcwm\" (UniqueName: \"kubernetes.io/projected/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-kube-api-access-twcwm\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.466808 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cda4fb4c-7510-49d2-b7bb-2a61c669bacd" (UID: "cda4fb4c-7510-49d2-b7bb-2a61c669bacd"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.473250 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cda4fb4c-7510-49d2-b7bb-2a61c669bacd" (UID: "cda4fb4c-7510-49d2-b7bb-2a61c669bacd"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.478948 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-config" (OuterVolumeSpecName: "config") pod "cda4fb4c-7510-49d2-b7bb-2a61c669bacd" (UID: "cda4fb4c-7510-49d2-b7bb-2a61c669bacd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.485376 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cda4fb4c-7510-49d2-b7bb-2a61c669bacd" (UID: "cda4fb4c-7510-49d2-b7bb-2a61c669bacd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.494219 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.494250 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.494260 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.494292 4781 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:39 
crc kubenswrapper[4781]: I0227 00:27:39.495144 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cda4fb4c-7510-49d2-b7bb-2a61c669bacd" (UID: "cda4fb4c-7510-49d2-b7bb-2a61c669bacd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.596422 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cda4fb4c-7510-49d2-b7bb-2a61c669bacd-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.726339 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.752249 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-m576l"] Feb 27 00:27:39 crc kubenswrapper[4781]: I0227 00:27:39.773307 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-m576l"] Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.356952 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.357004 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.398442 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-9fcdb6594-94vkn"] Feb 27 00:27:40 crc kubenswrapper[4781]: E0227 00:27:40.398903 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cda4fb4c-7510-49d2-b7bb-2a61c669bacd" containerName="init" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.398919 4781 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cda4fb4c-7510-49d2-b7bb-2a61c669bacd" containerName="init" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.399102 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="cda4fb4c-7510-49d2-b7bb-2a61c669bacd" containerName="init" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.400240 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.404833 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.405039 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.426150 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" event={"ID":"41039943-96a7-4fe6-8b66-0d64cd12a1fa","Type":"ContainerStarted","Data":"4111b413c8b691f7a54045bb36452f81012bdf07d4cada4fdae2d7b3ddfe3237"} Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.426186 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" event={"ID":"41039943-96a7-4fe6-8b66-0d64cd12a1fa","Type":"ContainerStarted","Data":"af5803adc1a48a1ca69fb2089b5d654625ecfb4fd451473c8e9a3e463b8767d4"} Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.434427 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" event={"ID":"f92df023-2e4a-495e-bbef-4a043c661f46","Type":"ContainerStarted","Data":"b9d473355f3d57f0e6e5327867c207fe882474bb2e3fa96e04a968bddf05b4e9"} Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.434519 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" 
event={"ID":"f92df023-2e4a-495e-bbef-4a043c661f46","Type":"ContainerStarted","Data":"c375d6b34e7ff931edc33e6ab35ecf11dbe7b62ea5e8fd59cb9e0c69680c4757"} Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.440543 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-9fcdb6594-94vkn"] Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.459555 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6fff4854c8-ttzsm" podStartSLOduration=3.212790004 podStartE2EDuration="7.459533087s" podCreationTimestamp="2026-02-27 00:27:33 +0000 UTC" firstStartedPulling="2026-02-27 00:27:34.810711773 +0000 UTC m=+1324.068251327" lastFinishedPulling="2026-02-27 00:27:39.057454856 +0000 UTC m=+1328.314994410" observedRunningTime="2026-02-27 00:27:40.440821112 +0000 UTC m=+1329.698360666" watchObservedRunningTime="2026-02-27 00:27:40.459533087 +0000 UTC m=+1329.717072641" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.478525 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.484046 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f","Type":"ContainerStarted","Data":"b28843ddc0eeedac24aea963235ae5c3e5d9e83cd06600a666e30355e28fcc9b"} Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.488340 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7dd7c6f4ff-m4d2l" podStartSLOduration=3.24144135 podStartE2EDuration="7.488317447s" podCreationTimestamp="2026-02-27 00:27:33 +0000 UTC" firstStartedPulling="2026-02-27 00:27:34.81060948 +0000 UTC m=+1324.068149034" lastFinishedPulling="2026-02-27 00:27:39.057485577 +0000 UTC m=+1328.315025131" observedRunningTime="2026-02-27 00:27:40.465999748 +0000 UTC m=+1329.723539302" 
watchObservedRunningTime="2026-02-27 00:27:40.488317447 +0000 UTC m=+1329.745857001" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.494119 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.502582 4781 generic.go:334] "Generic (PLEG): container finished" podID="f2a90e98-bb9f-436d-9a1c-8aebd91000e3" containerID="5c6246746a3c78078a59adb64a2979be72d82f5cfd95c152a4db993cadaf1efe" exitCode=0 Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.502701 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" event={"ID":"f2a90e98-bb9f-436d-9a1c-8aebd91000e3","Type":"ContainerDied","Data":"5c6246746a3c78078a59adb64a2979be72d82f5cfd95c152a4db993cadaf1efe"} Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.507752 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27","Type":"ContainerStarted","Data":"29e8f12fefa758a857d90ef4fb71256760930cd0ef0c749cfad385e52f207280"} Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.522298 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/582fee51-d9df-4150-b217-889f2f4f8852-public-tls-certs\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.522353 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/582fee51-d9df-4150-b217-889f2f4f8852-internal-tls-certs\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: 
I0227 00:27:40.522394 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/582fee51-d9df-4150-b217-889f2f4f8852-logs\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.522441 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/582fee51-d9df-4150-b217-889f2f4f8852-combined-ca-bundle\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.522682 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7qcq\" (UniqueName: \"kubernetes.io/projected/582fee51-d9df-4150-b217-889f2f4f8852-kube-api-access-n7qcq\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.522724 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/582fee51-d9df-4150-b217-889f2f4f8852-config-data-custom\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.522819 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/582fee51-d9df-4150-b217-889f2f4f8852-config-data\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 
27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.545165 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-l9w6z" event={"ID":"2274af64-0743-4ede-8fb8-e2ed801638ac","Type":"ContainerStarted","Data":"6964fd56259850480217527d40244a043795966342292bb5a943a33534e5489f"} Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.546409 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.546506 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.563063 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=34.563047782 podStartE2EDuration="34.563047782s" podCreationTimestamp="2026-02-27 00:27:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:27:40.5569343 +0000 UTC m=+1329.814473874" watchObservedRunningTime="2026-02-27 00:27:40.563047782 +0000 UTC m=+1329.820587336" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.625386 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-sync-l9w6z" podStartSLOduration=3.890576528 podStartE2EDuration="1m2.625367298s" podCreationTimestamp="2026-02-27 00:26:38 +0000 UTC" firstStartedPulling="2026-02-27 00:26:40.793529045 +0000 UTC m=+1270.051068599" lastFinishedPulling="2026-02-27 00:27:39.528319815 +0000 UTC m=+1328.785859369" observedRunningTime="2026-02-27 00:27:40.619886353 +0000 UTC m=+1329.877425907" watchObservedRunningTime="2026-02-27 00:27:40.625367298 +0000 UTC m=+1329.882906852" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.625961 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-n7qcq\" (UniqueName: \"kubernetes.io/projected/582fee51-d9df-4150-b217-889f2f4f8852-kube-api-access-n7qcq\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.626021 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/582fee51-d9df-4150-b217-889f2f4f8852-config-data-custom\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.626089 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/582fee51-d9df-4150-b217-889f2f4f8852-config-data\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.626144 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/582fee51-d9df-4150-b217-889f2f4f8852-public-tls-certs\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.626172 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/582fee51-d9df-4150-b217-889f2f4f8852-internal-tls-certs\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.626226 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/582fee51-d9df-4150-b217-889f2f4f8852-logs\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.626285 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/582fee51-d9df-4150-b217-889f2f4f8852-combined-ca-bundle\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.630073 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/582fee51-d9df-4150-b217-889f2f4f8852-logs\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.634912 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/582fee51-d9df-4150-b217-889f2f4f8852-config-data\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.635414 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/582fee51-d9df-4150-b217-889f2f4f8852-internal-tls-certs\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.636448 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/582fee51-d9df-4150-b217-889f2f4f8852-config-data-custom\") pod 
\"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.637221 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/582fee51-d9df-4150-b217-889f2f4f8852-public-tls-certs\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.641511 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/582fee51-d9df-4150-b217-889f2f4f8852-combined-ca-bundle\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.649942 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7qcq\" (UniqueName: \"kubernetes.io/projected/582fee51-d9df-4150-b217-889f2f4f8852-kube-api-access-n7qcq\") pod \"barbican-api-9fcdb6594-94vkn\" (UID: \"582fee51-d9df-4150-b217-889f2f4f8852\") " pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:40 crc kubenswrapper[4781]: I0227 00:27:40.736191 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:41 crc kubenswrapper[4781]: I0227 00:27:41.337393 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cda4fb4c-7510-49d2-b7bb-2a61c669bacd" path="/var/lib/kubelet/pods/cda4fb4c-7510-49d2-b7bb-2a61c669bacd/volumes" Feb 27 00:27:41 crc kubenswrapper[4781]: I0227 00:27:41.402057 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-9fcdb6594-94vkn"] Feb 27 00:27:41 crc kubenswrapper[4781]: W0227 00:27:41.420798 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod582fee51_d9df_4150_b217_889f2f4f8852.slice/crio-12d9d33c6a9d179d79a7973dee43ecbdf0803cff1fa39e9e98b5a1c252b40784 WatchSource:0}: Error finding container 12d9d33c6a9d179d79a7973dee43ecbdf0803cff1fa39e9e98b5a1c252b40784: Status 404 returned error can't find the container with id 12d9d33c6a9d179d79a7973dee43ecbdf0803cff1fa39e9e98b5a1c252b40784 Feb 27 00:27:41 crc kubenswrapper[4781]: I0227 00:27:41.593099 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9fcdb6594-94vkn" event={"ID":"582fee51-d9df-4150-b217-889f2f4f8852","Type":"ContainerStarted","Data":"12d9d33c6a9d179d79a7973dee43ecbdf0803cff1fa39e9e98b5a1c252b40784"} Feb 27 00:27:41 crc kubenswrapper[4781]: I0227 00:27:41.596580 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27","Type":"ContainerStarted","Data":"763dc70e202ffae8c7a4299d667760a89c08506825de70693e6f26bf51747078"} Feb 27 00:27:41 crc kubenswrapper[4781]: I0227 00:27:41.596756 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27" containerName="cinder-api-log" containerID="cri-o://29e8f12fefa758a857d90ef4fb71256760930cd0ef0c749cfad385e52f207280" gracePeriod=30 Feb 27 
00:27:41 crc kubenswrapper[4781]: I0227 00:27:41.597762 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 27 00:27:41 crc kubenswrapper[4781]: I0227 00:27:41.597845 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27" containerName="cinder-api" containerID="cri-o://763dc70e202ffae8c7a4299d667760a89c08506825de70693e6f26bf51747078" gracePeriod=30 Feb 27 00:27:41 crc kubenswrapper[4781]: I0227 00:27:41.610546 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c9e8c990-7ee9-4f45-91cd-3b49bffbe639","Type":"ContainerStarted","Data":"41d9cebd94820355907237ef05a02051691148a4c837c57f7086e2f710d9721d"} Feb 27 00:27:41 crc kubenswrapper[4781]: I0227 00:27:41.614682 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" event={"ID":"f2a90e98-bb9f-436d-9a1c-8aebd91000e3","Type":"ContainerStarted","Data":"8eb943556508c5cc9103fa044300406224b9b4973d8e501d8f7538f1c3573e24"} Feb 27 00:27:41 crc kubenswrapper[4781]: I0227 00:27:41.614722 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" Feb 27 00:27:41 crc kubenswrapper[4781]: I0227 00:27:41.641184 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.641165664 podStartE2EDuration="6.641165664s" podCreationTimestamp="2026-02-27 00:27:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:27:41.637962519 +0000 UTC m=+1330.895502083" watchObservedRunningTime="2026-02-27 00:27:41.641165664 +0000 UTC m=+1330.898705218" Feb 27 00:27:41 crc kubenswrapper[4781]: I0227 00:27:41.665352 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" podStartSLOduration=6.665318962 podStartE2EDuration="6.665318962s" podCreationTimestamp="2026-02-27 00:27:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:27:41.659474977 +0000 UTC m=+1330.917014871" watchObservedRunningTime="2026-02-27 00:27:41.665318962 +0000 UTC m=+1330.922858516" Feb 27 00:27:41 crc kubenswrapper[4781]: I0227 00:27:41.969991 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.628007 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.637874 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9fcdb6594-94vkn" event={"ID":"582fee51-d9df-4150-b217-889f2f4f8852","Type":"ContainerStarted","Data":"64905d7f8814a6f41685585d47354bca5f1dd631fafcd1e8f96fea6ccb13b368"} Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.637920 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9fcdb6594-94vkn" event={"ID":"582fee51-d9df-4150-b217-889f2f4f8852","Type":"ContainerStarted","Data":"7ae906019bba9d2b84dbd14808c0e069fccf6224d243961e67e2f805a0b64d72"} Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.638828 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.638859 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-9fcdb6594-94vkn" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.656020 4781 generic.go:334] "Generic (PLEG): container finished" podID="d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27" 
containerID="763dc70e202ffae8c7a4299d667760a89c08506825de70693e6f26bf51747078" exitCode=0 Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.656050 4781 generic.go:334] "Generic (PLEG): container finished" podID="d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27" containerID="29e8f12fefa758a857d90ef4fb71256760930cd0ef0c749cfad385e52f207280" exitCode=143 Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.656091 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27","Type":"ContainerDied","Data":"763dc70e202ffae8c7a4299d667760a89c08506825de70693e6f26bf51747078"} Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.656121 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27","Type":"ContainerDied","Data":"29e8f12fefa758a857d90ef4fb71256760930cd0ef0c749cfad385e52f207280"} Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.656131 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27","Type":"ContainerDied","Data":"7688a82f556cb3de2b0b7afd752b5b83e46b7cf5458a44e7fd021d53d5d1b140"} Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.656145 4781 scope.go:117] "RemoveContainer" containerID="763dc70e202ffae8c7a4299d667760a89c08506825de70693e6f26bf51747078" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.656266 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.662816 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c9e8c990-7ee9-4f45-91cd-3b49bffbe639","Type":"ContainerStarted","Data":"b553522f28c5c9228f18ef94c90af74eec79bd23cb9493d672e1a7c999be2dde"} Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.663349 4781 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.663366 4781 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.706806 4781 scope.go:117] "RemoveContainer" containerID="29e8f12fefa758a857d90ef4fb71256760930cd0ef0c749cfad385e52f207280" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.708472 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-9fcdb6594-94vkn" podStartSLOduration=2.70844983 podStartE2EDuration="2.70844983s" podCreationTimestamp="2026-02-27 00:27:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:27:42.6955728 +0000 UTC m=+1331.953112354" watchObservedRunningTime="2026-02-27 00:27:42.70844983 +0000 UTC m=+1331.965989384" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.736050 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=6.794091965 podStartE2EDuration="7.736034079s" podCreationTimestamp="2026-02-27 00:27:35 +0000 UTC" firstStartedPulling="2026-02-27 00:27:38.998260073 +0000 UTC m=+1328.255799627" lastFinishedPulling="2026-02-27 00:27:39.940202197 +0000 UTC m=+1329.197741741" observedRunningTime="2026-02-27 00:27:42.722311077 +0000 UTC m=+1331.979850651" watchObservedRunningTime="2026-02-27 00:27:42.736034079 +0000 UTC 
m=+1331.993573633" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.745365 4781 scope.go:117] "RemoveContainer" containerID="763dc70e202ffae8c7a4299d667760a89c08506825de70693e6f26bf51747078" Feb 27 00:27:42 crc kubenswrapper[4781]: E0227 00:27:42.748323 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"763dc70e202ffae8c7a4299d667760a89c08506825de70693e6f26bf51747078\": container with ID starting with 763dc70e202ffae8c7a4299d667760a89c08506825de70693e6f26bf51747078 not found: ID does not exist" containerID="763dc70e202ffae8c7a4299d667760a89c08506825de70693e6f26bf51747078" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.748362 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"763dc70e202ffae8c7a4299d667760a89c08506825de70693e6f26bf51747078"} err="failed to get container status \"763dc70e202ffae8c7a4299d667760a89c08506825de70693e6f26bf51747078\": rpc error: code = NotFound desc = could not find container \"763dc70e202ffae8c7a4299d667760a89c08506825de70693e6f26bf51747078\": container with ID starting with 763dc70e202ffae8c7a4299d667760a89c08506825de70693e6f26bf51747078 not found: ID does not exist" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.748384 4781 scope.go:117] "RemoveContainer" containerID="29e8f12fefa758a857d90ef4fb71256760930cd0ef0c749cfad385e52f207280" Feb 27 00:27:42 crc kubenswrapper[4781]: E0227 00:27:42.748766 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29e8f12fefa758a857d90ef4fb71256760930cd0ef0c749cfad385e52f207280\": container with ID starting with 29e8f12fefa758a857d90ef4fb71256760930cd0ef0c749cfad385e52f207280 not found: ID does not exist" containerID="29e8f12fefa758a857d90ef4fb71256760930cd0ef0c749cfad385e52f207280" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.748788 4781 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29e8f12fefa758a857d90ef4fb71256760930cd0ef0c749cfad385e52f207280"} err="failed to get container status \"29e8f12fefa758a857d90ef4fb71256760930cd0ef0c749cfad385e52f207280\": rpc error: code = NotFound desc = could not find container \"29e8f12fefa758a857d90ef4fb71256760930cd0ef0c749cfad385e52f207280\": container with ID starting with 29e8f12fefa758a857d90ef4fb71256760930cd0ef0c749cfad385e52f207280 not found: ID does not exist" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.748802 4781 scope.go:117] "RemoveContainer" containerID="763dc70e202ffae8c7a4299d667760a89c08506825de70693e6f26bf51747078" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.748991 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"763dc70e202ffae8c7a4299d667760a89c08506825de70693e6f26bf51747078"} err="failed to get container status \"763dc70e202ffae8c7a4299d667760a89c08506825de70693e6f26bf51747078\": rpc error: code = NotFound desc = could not find container \"763dc70e202ffae8c7a4299d667760a89c08506825de70693e6f26bf51747078\": container with ID starting with 763dc70e202ffae8c7a4299d667760a89c08506825de70693e6f26bf51747078 not found: ID does not exist" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.749005 4781 scope.go:117] "RemoveContainer" containerID="29e8f12fefa758a857d90ef4fb71256760930cd0ef0c749cfad385e52f207280" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.749204 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29e8f12fefa758a857d90ef4fb71256760930cd0ef0c749cfad385e52f207280"} err="failed to get container status \"29e8f12fefa758a857d90ef4fb71256760930cd0ef0c749cfad385e52f207280\": rpc error: code = NotFound desc = could not find container \"29e8f12fefa758a857d90ef4fb71256760930cd0ef0c749cfad385e52f207280\": container with ID starting with 
29e8f12fefa758a857d90ef4fb71256760930cd0ef0c749cfad385e52f207280 not found: ID does not exist" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.786406 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5fsl\" (UniqueName: \"kubernetes.io/projected/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-kube-api-access-g5fsl\") pod \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.786540 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-combined-ca-bundle\") pod \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.786563 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-config-data-custom\") pod \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.786694 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-config-data\") pod \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.786773 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-logs\") pod \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.786851 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-etc-machine-id\") pod \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.786885 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-scripts\") pod \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\" (UID: \"d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27\") " Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.792755 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-kube-api-access-g5fsl" (OuterVolumeSpecName: "kube-api-access-g5fsl") pod "d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27" (UID: "d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27"). InnerVolumeSpecName "kube-api-access-g5fsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.793085 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27" (UID: "d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.793833 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-logs" (OuterVolumeSpecName: "logs") pod "d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27" (UID: "d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.794943 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-scripts" (OuterVolumeSpecName: "scripts") pod "d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27" (UID: "d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.795740 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27" (UID: "d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.819936 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27" (UID: "d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.845710 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-config-data" (OuterVolumeSpecName: "config-data") pod "d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27" (UID: "d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.889494 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-logs\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.889530 4781 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.889543 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.889551 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5fsl\" (UniqueName: \"kubernetes.io/projected/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-kube-api-access-g5fsl\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.889561 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.889570 4781 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.889578 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.895652 4781 patch_prober.go:28] 
interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.895696 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.895738 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.896250 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"40924ce0e5e04646329cd01d3e3dc65fdaf6b21bdd01704d3fa5ed81c86443f6"} pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 00:27:42 crc kubenswrapper[4781]: I0227 00:27:42.896312 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" containerID="cri-o://40924ce0e5e04646329cd01d3e3dc65fdaf6b21bdd01704d3fa5ed81c86443f6" gracePeriod=600 Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.002106 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.019772 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 27 
00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.028061 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 27 00:27:43 crc kubenswrapper[4781]: E0227 00:27:43.028558 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27" containerName="cinder-api" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.028578 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27" containerName="cinder-api" Feb 27 00:27:43 crc kubenswrapper[4781]: E0227 00:27:43.028590 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27" containerName="cinder-api-log" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.028597 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27" containerName="cinder-api-log" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.031867 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27" containerName="cinder-api-log" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.031921 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27" containerName="cinder-api" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.034078 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.036499 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.037585 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.037783 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.037900 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.167408 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.197595 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cb0bf7e-097c-4c30-b0e6-224090588da2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.197673 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cb0bf7e-097c-4c30-b0e6-224090588da2-logs\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.197698 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1cb0bf7e-097c-4c30-b0e6-224090588da2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0" Feb 
27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.197802 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cb0bf7e-097c-4c30-b0e6-224090588da2-scripts\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.197843 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcxsc\" (UniqueName: \"kubernetes.io/projected/1cb0bf7e-097c-4c30-b0e6-224090588da2-kube-api-access-pcxsc\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.197898 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cb0bf7e-097c-4c30-b0e6-224090588da2-config-data\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.197920 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cb0bf7e-097c-4c30-b0e6-224090588da2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.197995 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cb0bf7e-097c-4c30-b0e6-224090588da2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.198040 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1cb0bf7e-097c-4c30-b0e6-224090588da2-config-data-custom\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.243881 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.264420 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.304725 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcxsc\" (UniqueName: \"kubernetes.io/projected/1cb0bf7e-097c-4c30-b0e6-224090588da2-kube-api-access-pcxsc\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.304792 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cb0bf7e-097c-4c30-b0e6-224090588da2-config-data\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.304821 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cb0bf7e-097c-4c30-b0e6-224090588da2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.304860 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cb0bf7e-097c-4c30-b0e6-224090588da2-public-tls-certs\") pod 
\"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.304898 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1cb0bf7e-097c-4c30-b0e6-224090588da2-config-data-custom\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.305011 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cb0bf7e-097c-4c30-b0e6-224090588da2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.305074 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cb0bf7e-097c-4c30-b0e6-224090588da2-logs\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.305100 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1cb0bf7e-097c-4c30-b0e6-224090588da2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.305147 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cb0bf7e-097c-4c30-b0e6-224090588da2-scripts\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.311606 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1cb0bf7e-097c-4c30-b0e6-224090588da2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.312426 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cb0bf7e-097c-4c30-b0e6-224090588da2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.317210 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cb0bf7e-097c-4c30-b0e6-224090588da2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.340776 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cb0bf7e-097c-4c30-b0e6-224090588da2-logs\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.347040 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cb0bf7e-097c-4c30-b0e6-224090588da2-scripts\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.348519 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1cb0bf7e-097c-4c30-b0e6-224090588da2-config-data-custom\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.360026 
4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27" path="/var/lib/kubelet/pods/d4ba4e2f-1bdf-4f98-a4f6-16c12df07d27/volumes" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.362969 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cb0bf7e-097c-4c30-b0e6-224090588da2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.396778 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cb0bf7e-097c-4c30-b0e6-224090588da2-config-data\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.419249 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcxsc\" (UniqueName: \"kubernetes.io/projected/1cb0bf7e-097c-4c30-b0e6-224090588da2-kube-api-access-pcxsc\") pod \"cinder-api-0\" (UID: \"1cb0bf7e-097c-4c30-b0e6-224090588da2\") " pod="openstack/cinder-api-0" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.510426 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5b48494fc7-447pr"] Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.510693 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5b48494fc7-447pr" podUID="2138a247-a569-4ed6-91a9-5dde2a0b5fa9" containerName="neutron-api" containerID="cri-o://994246fa04a777c2f0ceb85d5b3e476072c41f89030472fc48f602b083a3eada" gracePeriod=30 Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.511374 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5b48494fc7-447pr" podUID="2138a247-a569-4ed6-91a9-5dde2a0b5fa9" 
containerName="neutron-httpd" containerID="cri-o://d7c09d305d22e97d0875bde304e390f511aac9300a440daba221eab217d0ec4d" gracePeriod=30 Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.527844 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-56f5d76fc7-rbhdd"] Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.544433 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56f5d76fc7-rbhdd" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.549723 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-5b48494fc7-447pr" podUID="2138a247-a569-4ed6-91a9-5dde2a0b5fa9" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.176:9696/\": EOF" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.577102 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56f5d76fc7-rbhdd"] Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.648543 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/384db6f0-71f1-4926-9e65-5c27eb430325-config\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.648625 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/384db6f0-71f1-4926-9e65-5c27eb430325-internal-tls-certs\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.648689 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfrzq\" (UniqueName: 
\"kubernetes.io/projected/384db6f0-71f1-4926-9e65-5c27eb430325-kube-api-access-pfrzq\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.648721 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/384db6f0-71f1-4926-9e65-5c27eb430325-public-tls-certs\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.648743 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/384db6f0-71f1-4926-9e65-5c27eb430325-httpd-config\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.648955 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/384db6f0-71f1-4926-9e65-5c27eb430325-ovndb-tls-certs\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.649015 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/384db6f0-71f1-4926-9e65-5c27eb430325-combined-ca-bundle\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.658672 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.690379 4781 generic.go:334] "Generic (PLEG): container finished" podID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerID="40924ce0e5e04646329cd01d3e3dc65fdaf6b21bdd01704d3fa5ed81c86443f6" exitCode=0 Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.691107 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerDied","Data":"40924ce0e5e04646329cd01d3e3dc65fdaf6b21bdd01704d3fa5ed81c86443f6"} Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.691175 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"18f81d6f38ae3802e83160171263bed0ca095345d87ab2807429711c0c761818"} Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.691193 4781 scope.go:117] "RemoveContainer" containerID="58cd249b96a5284dbe453e012e30bb3f9acbc9ed9b891c6e44075d418edc5ad9" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.750897 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/384db6f0-71f1-4926-9e65-5c27eb430325-config\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.750981 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/384db6f0-71f1-4926-9e65-5c27eb430325-internal-tls-certs\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.751019 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfrzq\" (UniqueName: \"kubernetes.io/projected/384db6f0-71f1-4926-9e65-5c27eb430325-kube-api-access-pfrzq\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.751054 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/384db6f0-71f1-4926-9e65-5c27eb430325-public-tls-certs\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.751073 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/384db6f0-71f1-4926-9e65-5c27eb430325-httpd-config\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.751118 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/384db6f0-71f1-4926-9e65-5c27eb430325-ovndb-tls-certs\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.751141 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/384db6f0-71f1-4926-9e65-5c27eb430325-combined-ca-bundle\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.754971 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/384db6f0-71f1-4926-9e65-5c27eb430325-internal-tls-certs\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.758363 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/384db6f0-71f1-4926-9e65-5c27eb430325-ovndb-tls-certs\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.758421 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/384db6f0-71f1-4926-9e65-5c27eb430325-config\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.759952 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/384db6f0-71f1-4926-9e65-5c27eb430325-httpd-config\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.767678 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfrzq\" (UniqueName: \"kubernetes.io/projected/384db6f0-71f1-4926-9e65-5c27eb430325-kube-api-access-pfrzq\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.770038 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/384db6f0-71f1-4926-9e65-5c27eb430325-public-tls-certs\") pod 
\"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.770315 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/384db6f0-71f1-4926-9e65-5c27eb430325-combined-ca-bundle\") pod \"neutron-56f5d76fc7-rbhdd\" (UID: \"384db6f0-71f1-4926-9e65-5c27eb430325\") " pod="openstack/neutron-56f5d76fc7-rbhdd" Feb 27 00:27:43 crc kubenswrapper[4781]: I0227 00:27:43.863409 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56f5d76fc7-rbhdd" Feb 27 00:27:44 crc kubenswrapper[4781]: I0227 00:27:44.227195 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 27 00:27:44 crc kubenswrapper[4781]: I0227 00:27:44.661613 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56f5d76fc7-rbhdd"] Feb 27 00:27:44 crc kubenswrapper[4781]: I0227 00:27:44.717023 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1cb0bf7e-097c-4c30-b0e6-224090588da2","Type":"ContainerStarted","Data":"a3c8473f7c00f6f8d6ad0b5909fd3122ed20edb94776a38bb334741f178ba1b2"} Feb 27 00:27:44 crc kubenswrapper[4781]: I0227 00:27:44.721340 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56f5d76fc7-rbhdd" event={"ID":"384db6f0-71f1-4926-9e65-5c27eb430325","Type":"ContainerStarted","Data":"c72f5df6adad889cfddb9f3c11e18af222ff251cf27f930b227141e5f1669d89"} Feb 27 00:27:44 crc kubenswrapper[4781]: I0227 00:27:44.737235 4781 generic.go:334] "Generic (PLEG): container finished" podID="2138a247-a569-4ed6-91a9-5dde2a0b5fa9" containerID="d7c09d305d22e97d0875bde304e390f511aac9300a440daba221eab217d0ec4d" exitCode=0 Feb 27 00:27:44 crc kubenswrapper[4781]: I0227 00:27:44.737302 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-5b48494fc7-447pr" event={"ID":"2138a247-a569-4ed6-91a9-5dde2a0b5fa9","Type":"ContainerDied","Data":"d7c09d305d22e97d0875bde304e390f511aac9300a440daba221eab217d0ec4d"} Feb 27 00:27:45 crc kubenswrapper[4781]: I0227 00:27:45.029227 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-5b48494fc7-447pr" podUID="2138a247-a569-4ed6-91a9-5dde2a0b5fa9" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.176:9696/\": dial tcp 10.217.0.176:9696: connect: connection refused" Feb 27 00:27:45 crc kubenswrapper[4781]: I0227 00:27:45.503858 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:45 crc kubenswrapper[4781]: I0227 00:27:45.770620 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1cb0bf7e-097c-4c30-b0e6-224090588da2","Type":"ContainerStarted","Data":"c2f402a28d29e4c07e4caf87ce15c5ce23b22ea1ee6fcef0fd84ea91e0276827"} Feb 27 00:27:45 crc kubenswrapper[4781]: I0227 00:27:45.772567 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56f5d76fc7-rbhdd" event={"ID":"384db6f0-71f1-4926-9e65-5c27eb430325","Type":"ContainerStarted","Data":"e18093c34186b89284f05d20d70c61677202602e6d87793470b7ec47e6c4f2d1"} Feb 27 00:27:45 crc kubenswrapper[4781]: I0227 00:27:45.772594 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56f5d76fc7-rbhdd" event={"ID":"384db6f0-71f1-4926-9e65-5c27eb430325","Type":"ContainerStarted","Data":"e0bdefffb66c07dedeef73f1b0f37060ecf0ce4e0424b4701aba3fd0711d0981"} Feb 27 00:27:45 crc kubenswrapper[4781]: I0227 00:27:45.773992 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-56f5d76fc7-rbhdd" Feb 27 00:27:45 crc kubenswrapper[4781]: I0227 00:27:45.780728 4781 generic.go:334] "Generic (PLEG): container finished" podID="2274af64-0743-4ede-8fb8-e2ed801638ac" 
containerID="6964fd56259850480217527d40244a043795966342292bb5a943a33534e5489f" exitCode=0 Feb 27 00:27:45 crc kubenswrapper[4781]: I0227 00:27:45.780811 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-l9w6z" event={"ID":"2274af64-0743-4ede-8fb8-e2ed801638ac","Type":"ContainerDied","Data":"6964fd56259850480217527d40244a043795966342292bb5a943a33534e5489f"} Feb 27 00:27:45 crc kubenswrapper[4781]: I0227 00:27:45.802362 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-56f5d76fc7-rbhdd" podStartSLOduration=2.8023068650000003 podStartE2EDuration="2.802306865s" podCreationTimestamp="2026-02-27 00:27:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:27:45.79606681 +0000 UTC m=+1335.053606364" watchObservedRunningTime="2026-02-27 00:27:45.802306865 +0000 UTC m=+1335.059846419" Feb 27 00:27:45 crc kubenswrapper[4781]: I0227 00:27:45.954425 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 27 00:27:46 crc kubenswrapper[4781]: I0227 00:27:46.055499 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:46 crc kubenswrapper[4781]: I0227 00:27:46.271227 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 27 00:27:46 crc kubenswrapper[4781]: I0227 00:27:46.792163 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1cb0bf7e-097c-4c30-b0e6-224090588da2","Type":"ContainerStarted","Data":"186b7702c1547bb9b47df8a7f0efdae4d5d5c863e56c694361ce54ad078e236b"} Feb 27 00:27:46 crc kubenswrapper[4781]: I0227 00:27:46.859572 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 27 00:27:46 crc kubenswrapper[4781]: I0227 
00:27:46.863240 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.863220442 podStartE2EDuration="4.863220442s" podCreationTimestamp="2026-02-27 00:27:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:27:46.829551443 +0000 UTC m=+1336.087091027" watchObservedRunningTime="2026-02-27 00:27:46.863220442 +0000 UTC m=+1336.120759996" Feb 27 00:27:47 crc kubenswrapper[4781]: I0227 00:27:47.802845 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 27 00:27:47 crc kubenswrapper[4781]: I0227 00:27:47.803222 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="c9e8c990-7ee9-4f45-91cd-3b49bffbe639" containerName="cinder-scheduler" containerID="cri-o://41d9cebd94820355907237ef05a02051691148a4c837c57f7086e2f710d9721d" gracePeriod=30 Feb 27 00:27:47 crc kubenswrapper[4781]: I0227 00:27:47.803647 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="c9e8c990-7ee9-4f45-91cd-3b49bffbe639" containerName="probe" containerID="cri-o://b553522f28c5c9228f18ef94c90af74eec79bd23cb9493d672e1a7c999be2dde" gracePeriod=30 Feb 27 00:27:48 crc kubenswrapper[4781]: I0227 00:27:48.812541 4781 generic.go:334] "Generic (PLEG): container finished" podID="2138a247-a569-4ed6-91a9-5dde2a0b5fa9" containerID="994246fa04a777c2f0ceb85d5b3e476072c41f89030472fc48f602b083a3eada" exitCode=0 Feb 27 00:27:48 crc kubenswrapper[4781]: I0227 00:27:48.812609 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b48494fc7-447pr" event={"ID":"2138a247-a569-4ed6-91a9-5dde2a0b5fa9","Type":"ContainerDied","Data":"994246fa04a777c2f0ceb85d5b3e476072c41f89030472fc48f602b083a3eada"} Feb 27 00:27:48 crc kubenswrapper[4781]: I0227 00:27:48.814589 4781 
generic.go:334] "Generic (PLEG): container finished" podID="c9e8c990-7ee9-4f45-91cd-3b49bffbe639" containerID="b553522f28c5c9228f18ef94c90af74eec79bd23cb9493d672e1a7c999be2dde" exitCode=0 Feb 27 00:27:48 crc kubenswrapper[4781]: I0227 00:27:48.814611 4781 generic.go:334] "Generic (PLEG): container finished" podID="c9e8c990-7ee9-4f45-91cd-3b49bffbe639" containerID="41d9cebd94820355907237ef05a02051691148a4c837c57f7086e2f710d9721d" exitCode=0 Feb 27 00:27:48 crc kubenswrapper[4781]: I0227 00:27:48.815527 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c9e8c990-7ee9-4f45-91cd-3b49bffbe639","Type":"ContainerDied","Data":"b553522f28c5c9228f18ef94c90af74eec79bd23cb9493d672e1a7c999be2dde"} Feb 27 00:27:48 crc kubenswrapper[4781]: I0227 00:27:48.815553 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c9e8c990-7ee9-4f45-91cd-3b49bffbe639","Type":"ContainerDied","Data":"41d9cebd94820355907237ef05a02051691148a4c837c57f7086e2f710d9721d"} Feb 27 00:27:49 crc kubenswrapper[4781]: I0227 00:27:49.830620 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-l9w6z" event={"ID":"2274af64-0743-4ede-8fb8-e2ed801638ac","Type":"ContainerDied","Data":"6d62d5f9e32bc3adf9e5c830b2c7fb23773647380ed0a769526c60e85872b03f"} Feb 27 00:27:49 crc kubenswrapper[4781]: I0227 00:27:49.830890 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d62d5f9e32bc3adf9e5c830b2c7fb23773647380ed0a769526c60e85872b03f" Feb 27 00:27:49 crc kubenswrapper[4781]: I0227 00:27:49.915690 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-l9w6z" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.000411 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2274af64-0743-4ede-8fb8-e2ed801638ac-config-data\") pod \"2274af64-0743-4ede-8fb8-e2ed801638ac\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") " Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.000552 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwsv7\" (UniqueName: \"kubernetes.io/projected/2274af64-0743-4ede-8fb8-e2ed801638ac-kube-api-access-bwsv7\") pod \"2274af64-0743-4ede-8fb8-e2ed801638ac\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") " Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.000598 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2274af64-0743-4ede-8fb8-e2ed801638ac-scripts\") pod \"2274af64-0743-4ede-8fb8-e2ed801638ac\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") " Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.000664 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2274af64-0743-4ede-8fb8-e2ed801638ac-combined-ca-bundle\") pod \"2274af64-0743-4ede-8fb8-e2ed801638ac\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") " Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.000744 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/2274af64-0743-4ede-8fb8-e2ed801638ac-certs\") pod \"2274af64-0743-4ede-8fb8-e2ed801638ac\" (UID: \"2274af64-0743-4ede-8fb8-e2ed801638ac\") " Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.015865 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/2274af64-0743-4ede-8fb8-e2ed801638ac-scripts" (OuterVolumeSpecName: "scripts") pod "2274af64-0743-4ede-8fb8-e2ed801638ac" (UID: "2274af64-0743-4ede-8fb8-e2ed801638ac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.018779 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2274af64-0743-4ede-8fb8-e2ed801638ac-certs" (OuterVolumeSpecName: "certs") pod "2274af64-0743-4ede-8fb8-e2ed801638ac" (UID: "2274af64-0743-4ede-8fb8-e2ed801638ac"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.023169 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2274af64-0743-4ede-8fb8-e2ed801638ac-kube-api-access-bwsv7" (OuterVolumeSpecName: "kube-api-access-bwsv7") pod "2274af64-0743-4ede-8fb8-e2ed801638ac" (UID: "2274af64-0743-4ede-8fb8-e2ed801638ac"). InnerVolumeSpecName "kube-api-access-bwsv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.047966 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2274af64-0743-4ede-8fb8-e2ed801638ac-config-data" (OuterVolumeSpecName: "config-data") pod "2274af64-0743-4ede-8fb8-e2ed801638ac" (UID: "2274af64-0743-4ede-8fb8-e2ed801638ac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.059735 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2274af64-0743-4ede-8fb8-e2ed801638ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2274af64-0743-4ede-8fb8-e2ed801638ac" (UID: "2274af64-0743-4ede-8fb8-e2ed801638ac"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.105585 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwsv7\" (UniqueName: \"kubernetes.io/projected/2274af64-0743-4ede-8fb8-e2ed801638ac-kube-api-access-bwsv7\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.105648 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2274af64-0743-4ede-8fb8-e2ed801638ac-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.105659 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2274af64-0743-4ede-8fb8-e2ed801638ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.105677 4781 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/2274af64-0743-4ede-8fb8-e2ed801638ac-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.105686 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2274af64-0743-4ede-8fb8-e2ed801638ac-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.572399 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.579825 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.722431 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-public-tls-certs\") pod \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.722764 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-scripts\") pod \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.722815 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-config-data-custom\") pod \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.722849 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4zg7\" (UniqueName: \"kubernetes.io/projected/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-kube-api-access-h4zg7\") pod \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.722874 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-internal-tls-certs\") pod \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.722922 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"httpd-config\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-httpd-config\") pod \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.722981 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-config-data\") pod \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.723095 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-combined-ca-bundle\") pod \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.723614 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-ovndb-tls-certs\") pod \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.723663 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87l2c\" (UniqueName: \"kubernetes.io/projected/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-kube-api-access-87l2c\") pod \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.723691 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-combined-ca-bundle\") pod \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " Feb 27 
00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.723733 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-config\") pod \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\" (UID: \"2138a247-a569-4ed6-91a9-5dde2a0b5fa9\") " Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.723802 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-etc-machine-id\") pod \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\" (UID: \"c9e8c990-7ee9-4f45-91cd-3b49bffbe639\") " Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.724708 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c9e8c990-7ee9-4f45-91cd-3b49bffbe639" (UID: "c9e8c990-7ee9-4f45-91cd-3b49bffbe639"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.726362 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-kube-api-access-h4zg7" (OuterVolumeSpecName: "kube-api-access-h4zg7") pod "2138a247-a569-4ed6-91a9-5dde2a0b5fa9" (UID: "2138a247-a569-4ed6-91a9-5dde2a0b5fa9"). InnerVolumeSpecName "kube-api-access-h4zg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.735210 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-scripts" (OuterVolumeSpecName: "scripts") pod "c9e8c990-7ee9-4f45-91cd-3b49bffbe639" (UID: "c9e8c990-7ee9-4f45-91cd-3b49bffbe639"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.735877 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c9e8c990-7ee9-4f45-91cd-3b49bffbe639" (UID: "c9e8c990-7ee9-4f45-91cd-3b49bffbe639"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.741786 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-kube-api-access-87l2c" (OuterVolumeSpecName: "kube-api-access-87l2c") pod "c9e8c990-7ee9-4f45-91cd-3b49bffbe639" (UID: "c9e8c990-7ee9-4f45-91cd-3b49bffbe639"). InnerVolumeSpecName "kube-api-access-87l2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.744760 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "2138a247-a569-4ed6-91a9-5dde2a0b5fa9" (UID: "2138a247-a569-4ed6-91a9-5dde2a0b5fa9"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.826414 4781 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.826446 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.826456 4781 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.826465 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4zg7\" (UniqueName: \"kubernetes.io/projected/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-kube-api-access-h4zg7\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.826476 4781 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.826485 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87l2c\" (UniqueName: \"kubernetes.io/projected/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-kube-api-access-87l2c\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.843062 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2138a247-a569-4ed6-91a9-5dde2a0b5fa9" (UID: 
"2138a247-a569-4ed6-91a9-5dde2a0b5fa9"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.858374 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5b48494fc7-447pr" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.858432 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b48494fc7-447pr" event={"ID":"2138a247-a569-4ed6-91a9-5dde2a0b5fa9","Type":"ContainerDied","Data":"8cfc8b26590e03ab4b9d1a7221cd85bef307e38eb533c1221abe3eafc0089adc"} Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.858572 4781 scope.go:117] "RemoveContainer" containerID="d7c09d305d22e97d0875bde304e390f511aac9300a440daba221eab217d0ec4d" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.862268 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-l9w6z" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.862516 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.862822 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c9e8c990-7ee9-4f45-91cd-3b49bffbe639","Type":"ContainerDied","Data":"046131775b2694d2a879762121c97dfa31aa0232cc2f2fd28fe9bc50e7ab867a"} Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.881955 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9e8c990-7ee9-4f45-91cd-3b49bffbe639" (UID: "c9e8c990-7ee9-4f45-91cd-3b49bffbe639"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.895765 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.908168 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2138a247-a569-4ed6-91a9-5dde2a0b5fa9" (UID: "2138a247-a569-4ed6-91a9-5dde2a0b5fa9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.916017 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2138a247-a569-4ed6-91a9-5dde2a0b5fa9" (UID: "2138a247-a569-4ed6-91a9-5dde2a0b5fa9"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.929413 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.929465 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.929476 4781 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.929484 4781 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.966873 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-config" (OuterVolumeSpecName: "config") pod "2138a247-a569-4ed6-91a9-5dde2a0b5fa9" (UID: "2138a247-a569-4ed6-91a9-5dde2a0b5fa9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.977682 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-bp4v9"] Feb 27 00:27:50 crc kubenswrapper[4781]: I0227 00:27:50.977919 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" podUID="dd15e642-6664-416f-ac4e-9cddc96e5642" containerName="dnsmasq-dns" containerID="cri-o://17f046305517b35fcbe0f2929bf55b2ebd9a50075d046746605c3998a3b81daf" gracePeriod=10 Feb 27 00:27:51 crc kubenswrapper[4781]: E0227 00:27:51.003884 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.041537 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.075757 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "2138a247-a569-4ed6-91a9-5dde2a0b5fa9" (UID: "2138a247-a569-4ed6-91a9-5dde2a0b5fa9"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.127720 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-storageinit-g672n"] Feb 27 00:27:51 crc kubenswrapper[4781]: E0227 00:27:51.128114 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9e8c990-7ee9-4f45-91cd-3b49bffbe639" containerName="probe" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.128131 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e8c990-7ee9-4f45-91cd-3b49bffbe639" containerName="probe" Feb 27 00:27:51 crc kubenswrapper[4781]: E0227 00:27:51.128141 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2138a247-a569-4ed6-91a9-5dde2a0b5fa9" containerName="neutron-api" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.128146 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2138a247-a569-4ed6-91a9-5dde2a0b5fa9" containerName="neutron-api" Feb 27 00:27:51 crc kubenswrapper[4781]: E0227 00:27:51.128163 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9e8c990-7ee9-4f45-91cd-3b49bffbe639" containerName="cinder-scheduler" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.128170 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e8c990-7ee9-4f45-91cd-3b49bffbe639" containerName="cinder-scheduler" Feb 27 00:27:51 crc kubenswrapper[4781]: E0227 00:27:51.128190 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2274af64-0743-4ede-8fb8-e2ed801638ac" containerName="cloudkitty-db-sync" Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.128197 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2274af64-0743-4ede-8fb8-e2ed801638ac" containerName="cloudkitty-db-sync" Feb 27 00:27:51 crc kubenswrapper[4781]: E0227 00:27:51.128211 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2138a247-a569-4ed6-91a9-5dde2a0b5fa9" containerName="neutron-httpd" Feb 27 
00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.128217 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2138a247-a569-4ed6-91a9-5dde2a0b5fa9" containerName="neutron-httpd"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.128395 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2274af64-0743-4ede-8fb8-e2ed801638ac" containerName="cloudkitty-db-sync"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.128408 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2138a247-a569-4ed6-91a9-5dde2a0b5fa9" containerName="neutron-api"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.128418 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2138a247-a569-4ed6-91a9-5dde2a0b5fa9" containerName="neutron-httpd"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.128428 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9e8c990-7ee9-4f45-91cd-3b49bffbe639" containerName="probe"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.128448 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9e8c990-7ee9-4f45-91cd-3b49bffbe639" containerName="cinder-scheduler"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.134351 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-g672n"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.141055 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.141276 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.141440 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-qt68h"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.141643 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.141759 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.143083 4781 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2138a247-a569-4ed6-91a9-5dde2a0b5fa9-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.151654 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-config-data" (OuterVolumeSpecName: "config-data") pod "c9e8c990-7ee9-4f45-91cd-3b49bffbe639" (UID: "c9e8c990-7ee9-4f45-91cd-3b49bffbe639"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.174697 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-g672n"]
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.256111 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87b3198c-30ab-415a-b24b-b26ab3da838e-scripts\") pod \"cloudkitty-storageinit-g672n\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " pod="openstack/cloudkitty-storageinit-g672n"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.256274 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbrfx\" (UniqueName: \"kubernetes.io/projected/87b3198c-30ab-415a-b24b-b26ab3da838e-kube-api-access-zbrfx\") pod \"cloudkitty-storageinit-g672n\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " pod="openstack/cloudkitty-storageinit-g672n"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.256316 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87b3198c-30ab-415a-b24b-b26ab3da838e-combined-ca-bundle\") pod \"cloudkitty-storageinit-g672n\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " pod="openstack/cloudkitty-storageinit-g672n"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.256334 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/87b3198c-30ab-415a-b24b-b26ab3da838e-certs\") pod \"cloudkitty-storageinit-g672n\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " pod="openstack/cloudkitty-storageinit-g672n"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.256463 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87b3198c-30ab-415a-b24b-b26ab3da838e-config-data\") pod \"cloudkitty-storageinit-g672n\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " pod="openstack/cloudkitty-storageinit-g672n"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.256607 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e8c990-7ee9-4f45-91cd-3b49bffbe639-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.287937 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.306331 4781 scope.go:117] "RemoveContainer" containerID="994246fa04a777c2f0ceb85d5b3e476072c41f89030472fc48f602b083a3eada"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.308696 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.340859 4781 scope.go:117] "RemoveContainer" containerID="b553522f28c5c9228f18ef94c90af74eec79bd23cb9493d672e1a7c999be2dde"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.340863 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9e8c990-7ee9-4f45-91cd-3b49bffbe639" path="/var/lib/kubelet/pods/c9e8c990-7ee9-4f45-91cd-3b49bffbe639/volumes"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.341591 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5b48494fc7-447pr"]
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.347759 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5b48494fc7-447pr"]
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.357759 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.359966 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.361200 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbrfx\" (UniqueName: \"kubernetes.io/projected/87b3198c-30ab-415a-b24b-b26ab3da838e-kube-api-access-zbrfx\") pod \"cloudkitty-storageinit-g672n\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " pod="openstack/cloudkitty-storageinit-g672n"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.361265 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87b3198c-30ab-415a-b24b-b26ab3da838e-combined-ca-bundle\") pod \"cloudkitty-storageinit-g672n\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " pod="openstack/cloudkitty-storageinit-g672n"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.361286 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/87b3198c-30ab-415a-b24b-b26ab3da838e-certs\") pod \"cloudkitty-storageinit-g672n\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " pod="openstack/cloudkitty-storageinit-g672n"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.362065 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.362606 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87b3198c-30ab-415a-b24b-b26ab3da838e-config-data\") pod \"cloudkitty-storageinit-g672n\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " pod="openstack/cloudkitty-storageinit-g672n"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.362811 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87b3198c-30ab-415a-b24b-b26ab3da838e-scripts\") pod \"cloudkitty-storageinit-g672n\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " pod="openstack/cloudkitty-storageinit-g672n"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.363785 4781 scope.go:117] "RemoveContainer" containerID="41d9cebd94820355907237ef05a02051691148a4c837c57f7086e2f710d9721d"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.365833 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87b3198c-30ab-415a-b24b-b26ab3da838e-combined-ca-bundle\") pod \"cloudkitty-storageinit-g672n\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " pod="openstack/cloudkitty-storageinit-g672n"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.366620 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/87b3198c-30ab-415a-b24b-b26ab3da838e-certs\") pod \"cloudkitty-storageinit-g672n\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " pod="openstack/cloudkitty-storageinit-g672n"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.367393 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87b3198c-30ab-415a-b24b-b26ab3da838e-scripts\") pod \"cloudkitty-storageinit-g672n\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " pod="openstack/cloudkitty-storageinit-g672n"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.367443 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.368985 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87b3198c-30ab-415a-b24b-b26ab3da838e-config-data\") pod \"cloudkitty-storageinit-g672n\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " pod="openstack/cloudkitty-storageinit-g672n"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.384687 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbrfx\" (UniqueName: \"kubernetes.io/projected/87b3198c-30ab-415a-b24b-b26ab3da838e-kube-api-access-zbrfx\") pod \"cloudkitty-storageinit-g672n\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " pod="openstack/cloudkitty-storageinit-g672n"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.464397 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16cb4c6c-2ddb-41e0-8db3-f44961445474-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"16cb4c6c-2ddb-41e0-8db3-f44961445474\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.464445 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16cb4c6c-2ddb-41e0-8db3-f44961445474-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"16cb4c6c-2ddb-41e0-8db3-f44961445474\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.464463 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16cb4c6c-2ddb-41e0-8db3-f44961445474-scripts\") pod \"cinder-scheduler-0\" (UID: \"16cb4c6c-2ddb-41e0-8db3-f44961445474\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.464529 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16cb4c6c-2ddb-41e0-8db3-f44961445474-config-data\") pod \"cinder-scheduler-0\" (UID: \"16cb4c6c-2ddb-41e0-8db3-f44961445474\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.465447 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16cb4c6c-2ddb-41e0-8db3-f44961445474-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"16cb4c6c-2ddb-41e0-8db3-f44961445474\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.465473 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9plz\" (UniqueName: \"kubernetes.io/projected/16cb4c6c-2ddb-41e0-8db3-f44961445474-kube-api-access-d9plz\") pod \"cinder-scheduler-0\" (UID: \"16cb4c6c-2ddb-41e0-8db3-f44961445474\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: E0227 00:27:51.490615 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2274af64_0743_4ede_8fb8_e2ed801638ac.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2138a247_a569_4ed6_91a9_5dde2a0b5fa9.slice/crio-8cfc8b26590e03ab4b9d1a7221cd85bef307e38eb533c1221abe3eafc0089adc\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd15e642_6664_416f_ac4e_9cddc96e5642.slice/crio-17f046305517b35fcbe0f2929bf55b2ebd9a50075d046746605c3998a3b81daf.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2138a247_a569_4ed6_91a9_5dde2a0b5fa9.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd15e642_6664_416f_ac4e_9cddc96e5642.slice/crio-conmon-17f046305517b35fcbe0f2929bf55b2ebd9a50075d046746605c3998a3b81daf.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2274af64_0743_4ede_8fb8_e2ed801638ac.slice/crio-6d62d5f9e32bc3adf9e5c830b2c7fb23773647380ed0a769526c60e85872b03f\": RecentStats: unable to find data in memory cache]"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.568993 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16cb4c6c-2ddb-41e0-8db3-f44961445474-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"16cb4c6c-2ddb-41e0-8db3-f44961445474\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.569359 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16cb4c6c-2ddb-41e0-8db3-f44961445474-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"16cb4c6c-2ddb-41e0-8db3-f44961445474\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.569394 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16cb4c6c-2ddb-41e0-8db3-f44961445474-scripts\") pod \"cinder-scheduler-0\" (UID: \"16cb4c6c-2ddb-41e0-8db3-f44961445474\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.569491 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16cb4c6c-2ddb-41e0-8db3-f44961445474-config-data\") pod \"cinder-scheduler-0\" (UID: \"16cb4c6c-2ddb-41e0-8db3-f44961445474\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.569096 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16cb4c6c-2ddb-41e0-8db3-f44961445474-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"16cb4c6c-2ddb-41e0-8db3-f44961445474\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.569615 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16cb4c6c-2ddb-41e0-8db3-f44961445474-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"16cb4c6c-2ddb-41e0-8db3-f44961445474\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.569687 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9plz\" (UniqueName: \"kubernetes.io/projected/16cb4c6c-2ddb-41e0-8db3-f44961445474-kube-api-access-d9plz\") pod \"cinder-scheduler-0\" (UID: \"16cb4c6c-2ddb-41e0-8db3-f44961445474\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.573293 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16cb4c6c-2ddb-41e0-8db3-f44961445474-scripts\") pod \"cinder-scheduler-0\" (UID: \"16cb4c6c-2ddb-41e0-8db3-f44961445474\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.573781 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16cb4c6c-2ddb-41e0-8db3-f44961445474-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"16cb4c6c-2ddb-41e0-8db3-f44961445474\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.574175 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16cb4c6c-2ddb-41e0-8db3-f44961445474-config-data\") pod \"cinder-scheduler-0\" (UID: \"16cb4c6c-2ddb-41e0-8db3-f44961445474\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.574492 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-bp4v9"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.577917 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16cb4c6c-2ddb-41e0-8db3-f44961445474-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"16cb4c6c-2ddb-41e0-8db3-f44961445474\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.587794 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9plz\" (UniqueName: \"kubernetes.io/projected/16cb4c6c-2ddb-41e0-8db3-f44961445474-kube-api-access-d9plz\") pod \"cinder-scheduler-0\" (UID: \"16cb4c6c-2ddb-41e0-8db3-f44961445474\") " pod="openstack/cinder-scheduler-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.589871 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-g672n"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.671552 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxql5\" (UniqueName: \"kubernetes.io/projected/dd15e642-6664-416f-ac4e-9cddc96e5642-kube-api-access-rxql5\") pod \"dd15e642-6664-416f-ac4e-9cddc96e5642\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") "
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.671742 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-ovsdbserver-sb\") pod \"dd15e642-6664-416f-ac4e-9cddc96e5642\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") "
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.671777 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-config\") pod \"dd15e642-6664-416f-ac4e-9cddc96e5642\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") "
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.671838 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-dns-svc\") pod \"dd15e642-6664-416f-ac4e-9cddc96e5642\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") "
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.671993 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-dns-swift-storage-0\") pod \"dd15e642-6664-416f-ac4e-9cddc96e5642\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") "
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.672084 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-ovsdbserver-nb\") pod \"dd15e642-6664-416f-ac4e-9cddc96e5642\" (UID: \"dd15e642-6664-416f-ac4e-9cddc96e5642\") "
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.676650 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd15e642-6664-416f-ac4e-9cddc96e5642-kube-api-access-rxql5" (OuterVolumeSpecName: "kube-api-access-rxql5") pod "dd15e642-6664-416f-ac4e-9cddc96e5642" (UID: "dd15e642-6664-416f-ac4e-9cddc96e5642"). InnerVolumeSpecName "kube-api-access-rxql5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.678146 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.729264 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dd15e642-6664-416f-ac4e-9cddc96e5642" (UID: "dd15e642-6664-416f-ac4e-9cddc96e5642"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.753125 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "dd15e642-6664-416f-ac4e-9cddc96e5642" (UID: "dd15e642-6664-416f-ac4e-9cddc96e5642"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.756669 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-config" (OuterVolumeSpecName: "config") pod "dd15e642-6664-416f-ac4e-9cddc96e5642" (UID: "dd15e642-6664-416f-ac4e-9cddc96e5642"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.771894 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dd15e642-6664-416f-ac4e-9cddc96e5642" (UID: "dd15e642-6664-416f-ac4e-9cddc96e5642"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.774197 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxql5\" (UniqueName: \"kubernetes.io/projected/dd15e642-6664-416f-ac4e-9cddc96e5642-kube-api-access-rxql5\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.774225 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.774234 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.774242 4781 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.774250 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.787326 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dd15e642-6664-416f-ac4e-9cddc96e5642" (UID: "dd15e642-6664-416f-ac4e-9cddc96e5642"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.876406 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd15e642-6664-416f-ac4e-9cddc96e5642-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.890944 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d","Type":"ContainerStarted","Data":"e8ba69d86c8f47c6834df258b24596973100aaaf0d5cd35b93784d1bd516d0f8"}
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.891309 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" containerName="ceilometer-notification-agent" containerID="cri-o://a57ddca737b909e7bdd1e80d02f2cf19f6581e0895c36ed2a03b91c68fe41892" gracePeriod=30
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.891457 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.891480 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" containerName="proxy-httpd" containerID="cri-o://e8ba69d86c8f47c6834df258b24596973100aaaf0d5cd35b93784d1bd516d0f8" gracePeriod=30
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.891542 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" containerName="sg-core" containerID="cri-o://b26ad8aadc8d9267db46f4b4e8381012905bb17ae767e027260b91762d34d717" gracePeriod=30
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.916928 4781 generic.go:334] "Generic (PLEG): container finished" podID="dd15e642-6664-416f-ac4e-9cddc96e5642" containerID="17f046305517b35fcbe0f2929bf55b2ebd9a50075d046746605c3998a3b81daf" exitCode=0
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.916991 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" event={"ID":"dd15e642-6664-416f-ac4e-9cddc96e5642","Type":"ContainerDied","Data":"17f046305517b35fcbe0f2929bf55b2ebd9a50075d046746605c3998a3b81daf"}
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.917019 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-bp4v9" event={"ID":"dd15e642-6664-416f-ac4e-9cddc96e5642","Type":"ContainerDied","Data":"9e67430f08589dcb2cfca360edd38ce35b2b7fe28eecbb76ca402ae3e309ab2c"}
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.917035 4781 scope.go:117] "RemoveContainer" containerID="17f046305517b35fcbe0f2929bf55b2ebd9a50075d046746605c3998a3b81daf"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.917146 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-bp4v9"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.965937 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-bp4v9"]
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.970334 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.973455 4781 scope.go:117] "RemoveContainer" containerID="90954f2997216aabd438b4d76ca15d61a674e3f9cbf71c7c021a82a58f29b4b5"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.977138 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-bp4v9"]
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.979468 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0"
Feb 27 00:27:51 crc kubenswrapper[4781]: I0227 00:27:51.997431 4781 scope.go:117] "RemoveContainer" containerID="17f046305517b35fcbe0f2929bf55b2ebd9a50075d046746605c3998a3b81daf"
Feb 27 00:27:52 crc kubenswrapper[4781]: E0227 00:27:52.017792 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17f046305517b35fcbe0f2929bf55b2ebd9a50075d046746605c3998a3b81daf\": container with ID starting with 17f046305517b35fcbe0f2929bf55b2ebd9a50075d046746605c3998a3b81daf not found: ID does not exist" containerID="17f046305517b35fcbe0f2929bf55b2ebd9a50075d046746605c3998a3b81daf"
Feb 27 00:27:52 crc kubenswrapper[4781]: I0227 00:27:52.017836 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17f046305517b35fcbe0f2929bf55b2ebd9a50075d046746605c3998a3b81daf"} err="failed to get container status \"17f046305517b35fcbe0f2929bf55b2ebd9a50075d046746605c3998a3b81daf\": rpc error: code = NotFound desc = could not find container \"17f046305517b35fcbe0f2929bf55b2ebd9a50075d046746605c3998a3b81daf\": container with ID starting with 17f046305517b35fcbe0f2929bf55b2ebd9a50075d046746605c3998a3b81daf not found: ID does not exist"
Feb 27 00:27:52 crc kubenswrapper[4781]: I0227 00:27:52.017863 4781 scope.go:117] "RemoveContainer" containerID="90954f2997216aabd438b4d76ca15d61a674e3f9cbf71c7c021a82a58f29b4b5"
Feb 27 00:27:52 crc kubenswrapper[4781]: E0227 00:27:52.025178 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90954f2997216aabd438b4d76ca15d61a674e3f9cbf71c7c021a82a58f29b4b5\": container with ID starting with 90954f2997216aabd438b4d76ca15d61a674e3f9cbf71c7c021a82a58f29b4b5 not found: ID does not exist" containerID="90954f2997216aabd438b4d76ca15d61a674e3f9cbf71c7c021a82a58f29b4b5"
Feb 27 00:27:52 crc kubenswrapper[4781]: I0227 00:27:52.026330 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90954f2997216aabd438b4d76ca15d61a674e3f9cbf71c7c021a82a58f29b4b5"} err="failed to get container status \"90954f2997216aabd438b4d76ca15d61a674e3f9cbf71c7c021a82a58f29b4b5\": rpc error: code = NotFound desc = could not find container \"90954f2997216aabd438b4d76ca15d61a674e3f9cbf71c7c021a82a58f29b4b5\": container with ID starting with 90954f2997216aabd438b4d76ca15d61a674e3f9cbf71c7c021a82a58f29b4b5 not found: ID does not exist"
Feb 27 00:27:52 crc kubenswrapper[4781]: I0227 00:27:52.165458 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-g672n"]
Feb 27 00:27:52 crc kubenswrapper[4781]: I0227 00:27:52.326906 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 27 00:27:52 crc kubenswrapper[4781]: W0227 00:27:52.329091 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16cb4c6c_2ddb_41e0_8db3_f44961445474.slice/crio-b23fcde744d6f232272b068172acc365fc9f6d2c6112861ce3acf5dc8a4f39a5 WatchSource:0}: Error finding container b23fcde744d6f232272b068172acc365fc9f6d2c6112861ce3acf5dc8a4f39a5: Status 404 returned error can't find the container with id b23fcde744d6f232272b068172acc365fc9f6d2c6112861ce3acf5dc8a4f39a5
Feb 27 00:27:52 crc kubenswrapper[4781]: I0227 00:27:52.972363 4781 generic.go:334] "Generic (PLEG): container finished" podID="c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" containerID="e8ba69d86c8f47c6834df258b24596973100aaaf0d5cd35b93784d1bd516d0f8" exitCode=0
Feb 27 00:27:52 crc kubenswrapper[4781]: I0227 00:27:52.972733 4781 generic.go:334] "Generic (PLEG): container finished" podID="c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" containerID="b26ad8aadc8d9267db46f4b4e8381012905bb17ae767e027260b91762d34d717" exitCode=2
Feb 27 00:27:52 crc kubenswrapper[4781]: I0227 00:27:52.972730 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d","Type":"ContainerDied","Data":"e8ba69d86c8f47c6834df258b24596973100aaaf0d5cd35b93784d1bd516d0f8"}
Feb 27 00:27:52 crc kubenswrapper[4781]: I0227 00:27:52.972794 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d","Type":"ContainerDied","Data":"b26ad8aadc8d9267db46f4b4e8381012905bb17ae767e027260b91762d34d717"}
Feb 27 00:27:52 crc kubenswrapper[4781]: I0227 00:27:52.976196 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"16cb4c6c-2ddb-41e0-8db3-f44961445474","Type":"ContainerStarted","Data":"4e0ede3a6498fb18b0850e460c17347bbbbbad2912a10979839ed6832112689b"}
Feb 27 00:27:52 crc kubenswrapper[4781]: I0227 00:27:52.976248 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"16cb4c6c-2ddb-41e0-8db3-f44961445474","Type":"ContainerStarted","Data":"b23fcde744d6f232272b068172acc365fc9f6d2c6112861ce3acf5dc8a4f39a5"}
Feb 27 00:27:52 crc kubenswrapper[4781]: I0227 00:27:52.977774 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-g672n" event={"ID":"87b3198c-30ab-415a-b24b-b26ab3da838e","Type":"ContainerStarted","Data":"7aaaa3159dfec72ce2bfd72718ace0516b0de685b4c75d813a19d16d4226019b"}
Feb 27 00:27:52 crc kubenswrapper[4781]: I0227 00:27:52.977805 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-g672n" event={"ID":"87b3198c-30ab-415a-b24b-b26ab3da838e","Type":"ContainerStarted","Data":"5ea5e68fe7fb3730a14c055ae47e47a12d2ed4ea16d87ceb507c87aa6e875602"}
Feb 27 00:27:52 crc kubenswrapper[4781]: I0227 00:27:52.988954 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-9fcdb6594-94vkn"
Feb 27 00:27:52 crc kubenswrapper[4781]: I0227 00:27:52.994993 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0"
Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.036416 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-storageinit-g672n" podStartSLOduration=2.03639958 podStartE2EDuration="2.03639958s" podCreationTimestamp="2026-02-27 00:27:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:27:53.000014509 +0000 UTC m=+1342.257554083" watchObservedRunningTime="2026-02-27 00:27:53.03639958 +0000 UTC m=+1342.293939134"
Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.156275 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-9fcdb6594-94vkn"
Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.240810 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5fbbfd856b-vgvjg"]
Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.241009 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5fbbfd856b-vgvjg" podUID="49f24c54-4f24-4f97-a01a-04640bf67b0f" containerName="barbican-api-log" containerID="cri-o://ee82fae8a491ded998bd0190a2bb94c2ff762013e3316811c1d1f983f5c06787" gracePeriod=30
Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.242354 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5fbbfd856b-vgvjg" podUID="49f24c54-4f24-4f97-a01a-04640bf67b0f" containerName="barbican-api" containerID="cri-o://0f274fbd09031e3b8e38174b2cbe52a6c6f5f24b60283aea8a0e1a01875fd8b1" gracePeriod=30
Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.470543 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2138a247-a569-4ed6-91a9-5dde2a0b5fa9" path="/var/lib/kubelet/pods/2138a247-a569-4ed6-91a9-5dde2a0b5fa9/volumes"
Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.471478 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd15e642-6664-416f-ac4e-9cddc96e5642" path="/var/lib/kubelet/pods/dd15e642-6664-416f-ac4e-9cddc96e5642/volumes"
Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.790284 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.894366 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-scripts\") pod \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") "
Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.894767 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-run-httpd\") pod \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") "
Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.894827 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-combined-ca-bundle\") pod \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") "
Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.894890 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qwfg\" (UniqueName: \"kubernetes.io/projected/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-kube-api-access-9qwfg\") pod \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") "
Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.894926 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-sg-core-conf-yaml\") pod \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") "
Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.894947 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName:
\"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-config-data\") pod \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.894962 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-log-httpd\") pod \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\" (UID: \"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d\") " Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.895069 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" (UID: "c607f0bd-ab23-4fc5-8aa7-437be5e6d59d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.895417 4781 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.895972 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" (UID: "c607f0bd-ab23-4fc5-8aa7-437be5e6d59d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.921904 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-scripts" (OuterVolumeSpecName: "scripts") pod "c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" (UID: "c607f0bd-ab23-4fc5-8aa7-437be5e6d59d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.938790 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-kube-api-access-9qwfg" (OuterVolumeSpecName: "kube-api-access-9qwfg") pod "c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" (UID: "c607f0bd-ab23-4fc5-8aa7-437be5e6d59d"). InnerVolumeSpecName "kube-api-access-9qwfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:27:53 crc kubenswrapper[4781]: I0227 00:27:53.981722 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" (UID: "c607f0bd-ab23-4fc5-8aa7-437be5e6d59d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.001989 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.002025 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qwfg\" (UniqueName: \"kubernetes.io/projected/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-kube-api-access-9qwfg\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.002035 4781 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.002043 4781 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-log-httpd\") on node 
\"crc\" DevicePath \"\"" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.041422 4781 generic.go:334] "Generic (PLEG): container finished" podID="c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" containerID="a57ddca737b909e7bdd1e80d02f2cf19f6581e0895c36ed2a03b91c68fe41892" exitCode=0 Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.041487 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d","Type":"ContainerDied","Data":"a57ddca737b909e7bdd1e80d02f2cf19f6581e0895c36ed2a03b91c68fe41892"} Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.041514 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c607f0bd-ab23-4fc5-8aa7-437be5e6d59d","Type":"ContainerDied","Data":"8523e5974cb6fe577a148d4d77627c86ea1298c44ff6fdd8db602516c249b5d9"} Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.041533 4781 scope.go:117] "RemoveContainer" containerID="e8ba69d86c8f47c6834df258b24596973100aaaf0d5cd35b93784d1bd516d0f8" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.041672 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.050416 4781 generic.go:334] "Generic (PLEG): container finished" podID="49f24c54-4f24-4f97-a01a-04640bf67b0f" containerID="ee82fae8a491ded998bd0190a2bb94c2ff762013e3316811c1d1f983f5c06787" exitCode=143 Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.050741 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fbbfd856b-vgvjg" event={"ID":"49f24c54-4f24-4f97-a01a-04640bf67b0f","Type":"ContainerDied","Data":"ee82fae8a491ded998bd0190a2bb94c2ff762013e3316811c1d1f983f5c06787"} Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.129257 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" (UID: "c607f0bd-ab23-4fc5-8aa7-437be5e6d59d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.201712 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-config-data" (OuterVolumeSpecName: "config-data") pod "c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" (UID: "c607f0bd-ab23-4fc5-8aa7-437be5e6d59d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.214819 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.214851 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.236172 4781 scope.go:117] "RemoveContainer" containerID="b26ad8aadc8d9267db46f4b4e8381012905bb17ae767e027260b91762d34d717" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.265870 4781 scope.go:117] "RemoveContainer" containerID="a57ddca737b909e7bdd1e80d02f2cf19f6581e0895c36ed2a03b91c68fe41892" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.297537 4781 scope.go:117] "RemoveContainer" containerID="e8ba69d86c8f47c6834df258b24596973100aaaf0d5cd35b93784d1bd516d0f8" Feb 27 00:27:54 crc kubenswrapper[4781]: E0227 00:27:54.297979 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8ba69d86c8f47c6834df258b24596973100aaaf0d5cd35b93784d1bd516d0f8\": container with ID starting with e8ba69d86c8f47c6834df258b24596973100aaaf0d5cd35b93784d1bd516d0f8 not found: ID does not exist" containerID="e8ba69d86c8f47c6834df258b24596973100aaaf0d5cd35b93784d1bd516d0f8" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.298010 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8ba69d86c8f47c6834df258b24596973100aaaf0d5cd35b93784d1bd516d0f8"} err="failed to get container status \"e8ba69d86c8f47c6834df258b24596973100aaaf0d5cd35b93784d1bd516d0f8\": rpc error: code = NotFound desc = could not find container 
\"e8ba69d86c8f47c6834df258b24596973100aaaf0d5cd35b93784d1bd516d0f8\": container with ID starting with e8ba69d86c8f47c6834df258b24596973100aaaf0d5cd35b93784d1bd516d0f8 not found: ID does not exist" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.298034 4781 scope.go:117] "RemoveContainer" containerID="b26ad8aadc8d9267db46f4b4e8381012905bb17ae767e027260b91762d34d717" Feb 27 00:27:54 crc kubenswrapper[4781]: E0227 00:27:54.298497 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b26ad8aadc8d9267db46f4b4e8381012905bb17ae767e027260b91762d34d717\": container with ID starting with b26ad8aadc8d9267db46f4b4e8381012905bb17ae767e027260b91762d34d717 not found: ID does not exist" containerID="b26ad8aadc8d9267db46f4b4e8381012905bb17ae767e027260b91762d34d717" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.298521 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b26ad8aadc8d9267db46f4b4e8381012905bb17ae767e027260b91762d34d717"} err="failed to get container status \"b26ad8aadc8d9267db46f4b4e8381012905bb17ae767e027260b91762d34d717\": rpc error: code = NotFound desc = could not find container \"b26ad8aadc8d9267db46f4b4e8381012905bb17ae767e027260b91762d34d717\": container with ID starting with b26ad8aadc8d9267db46f4b4e8381012905bb17ae767e027260b91762d34d717 not found: ID does not exist" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.298536 4781 scope.go:117] "RemoveContainer" containerID="a57ddca737b909e7bdd1e80d02f2cf19f6581e0895c36ed2a03b91c68fe41892" Feb 27 00:27:54 crc kubenswrapper[4781]: E0227 00:27:54.298877 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a57ddca737b909e7bdd1e80d02f2cf19f6581e0895c36ed2a03b91c68fe41892\": container with ID starting with a57ddca737b909e7bdd1e80d02f2cf19f6581e0895c36ed2a03b91c68fe41892 not found: ID does not exist" 
containerID="a57ddca737b909e7bdd1e80d02f2cf19f6581e0895c36ed2a03b91c68fe41892" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.298898 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a57ddca737b909e7bdd1e80d02f2cf19f6581e0895c36ed2a03b91c68fe41892"} err="failed to get container status \"a57ddca737b909e7bdd1e80d02f2cf19f6581e0895c36ed2a03b91c68fe41892\": rpc error: code = NotFound desc = could not find container \"a57ddca737b909e7bdd1e80d02f2cf19f6581e0895c36ed2a03b91c68fe41892\": container with ID starting with a57ddca737b909e7bdd1e80d02f2cf19f6581e0895c36ed2a03b91c68fe41892 not found: ID does not exist" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.393752 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.414258 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.427349 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:27:54 crc kubenswrapper[4781]: E0227 00:27:54.427782 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" containerName="proxy-httpd" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.427800 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" containerName="proxy-httpd" Feb 27 00:27:54 crc kubenswrapper[4781]: E0227 00:27:54.427810 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd15e642-6664-416f-ac4e-9cddc96e5642" containerName="dnsmasq-dns" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.427817 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd15e642-6664-416f-ac4e-9cddc96e5642" containerName="dnsmasq-dns" Feb 27 00:27:54 crc kubenswrapper[4781]: E0227 00:27:54.427852 4781 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="dd15e642-6664-416f-ac4e-9cddc96e5642" containerName="init" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.427859 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd15e642-6664-416f-ac4e-9cddc96e5642" containerName="init" Feb 27 00:27:54 crc kubenswrapper[4781]: E0227 00:27:54.427871 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" containerName="ceilometer-notification-agent" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.427878 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" containerName="ceilometer-notification-agent" Feb 27 00:27:54 crc kubenswrapper[4781]: E0227 00:27:54.427891 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" containerName="sg-core" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.427897 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" containerName="sg-core" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.428076 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" containerName="sg-core" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.428109 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" containerName="ceilometer-notification-agent" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.428126 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" containerName="proxy-httpd" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.428139 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd15e642-6664-416f-ac4e-9cddc96e5642" containerName="dnsmasq-dns" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.430175 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.433123 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.433466 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.445719 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.537864 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-scripts\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.537912 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.537978 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a732412d-8655-4df0-90ba-1bf854b6d8d1-log-httpd\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.538305 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-config-data\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " 
pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.538377 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lrpr\" (UniqueName: \"kubernetes.io/projected/a732412d-8655-4df0-90ba-1bf854b6d8d1-kube-api-access-2lrpr\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.538420 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a732412d-8655-4df0-90ba-1bf854b6d8d1-run-httpd\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.538462 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.639918 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-config-data\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.640567 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lrpr\" (UniqueName: \"kubernetes.io/projected/a732412d-8655-4df0-90ba-1bf854b6d8d1-kube-api-access-2lrpr\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.640609 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a732412d-8655-4df0-90ba-1bf854b6d8d1-run-httpd\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.640663 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.640708 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-scripts\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.640739 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.640877 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a732412d-8655-4df0-90ba-1bf854b6d8d1-log-httpd\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.641290 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a732412d-8655-4df0-90ba-1bf854b6d8d1-log-httpd\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " 
pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.641358 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a732412d-8655-4df0-90ba-1bf854b6d8d1-run-httpd\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.644669 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-config-data\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.645577 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-scripts\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.646188 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.648215 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.660823 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lrpr\" (UniqueName: 
\"kubernetes.io/projected/a732412d-8655-4df0-90ba-1bf854b6d8d1-kube-api-access-2lrpr\") pod \"ceilometer-0\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " pod="openstack/ceilometer-0" Feb 27 00:27:54 crc kubenswrapper[4781]: I0227 00:27:54.753817 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:27:55 crc kubenswrapper[4781]: I0227 00:27:55.060272 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"16cb4c6c-2ddb-41e0-8db3-f44961445474","Type":"ContainerStarted","Data":"1e9ffcbbd25742c01039bd3a97ac6ddd0b05895a54aef1dbf572bec3de71584f"} Feb 27 00:27:55 crc kubenswrapper[4781]: I0227 00:27:55.082522 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.082478184 podStartE2EDuration="4.082478184s" podCreationTimestamp="2026-02-27 00:27:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:27:55.078824518 +0000 UTC m=+1344.336364072" watchObservedRunningTime="2026-02-27 00:27:55.082478184 +0000 UTC m=+1344.340017728" Feb 27 00:27:55 crc kubenswrapper[4781]: I0227 00:27:55.322806 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c607f0bd-ab23-4fc5-8aa7-437be5e6d59d" path="/var/lib/kubelet/pods/c607f0bd-ab23-4fc5-8aa7-437be5e6d59d/volumes" Feb 27 00:27:55 crc kubenswrapper[4781]: I0227 00:27:55.348059 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:27:55 crc kubenswrapper[4781]: W0227 00:27:55.349920 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda732412d_8655_4df0_90ba_1bf854b6d8d1.slice/crio-10c0f1e24689995e21992a81df3156a3ac869c2c63cffc5db5d95aae3523ee7b WatchSource:0}: Error finding container 
10c0f1e24689995e21992a81df3156a3ac869c2c63cffc5db5d95aae3523ee7b: Status 404 returned error can't find the container with id 10c0f1e24689995e21992a81df3156a3ac869c2c63cffc5db5d95aae3523ee7b Feb 27 00:27:55 crc kubenswrapper[4781]: I0227 00:27:55.988731 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 27 00:27:56 crc kubenswrapper[4781]: I0227 00:27:56.079510 4781 generic.go:334] "Generic (PLEG): container finished" podID="87b3198c-30ab-415a-b24b-b26ab3da838e" containerID="7aaaa3159dfec72ce2bfd72718ace0516b0de685b4c75d813a19d16d4226019b" exitCode=0 Feb 27 00:27:56 crc kubenswrapper[4781]: I0227 00:27:56.079595 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-g672n" event={"ID":"87b3198c-30ab-415a-b24b-b26ab3da838e","Type":"ContainerDied","Data":"7aaaa3159dfec72ce2bfd72718ace0516b0de685b4c75d813a19d16d4226019b"} Feb 27 00:27:56 crc kubenswrapper[4781]: I0227 00:27:56.082861 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a732412d-8655-4df0-90ba-1bf854b6d8d1","Type":"ContainerStarted","Data":"10c0f1e24689995e21992a81df3156a3ac869c2c63cffc5db5d95aae3523ee7b"} Feb 27 00:27:56 crc kubenswrapper[4781]: I0227 00:27:56.679671 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 27 00:27:56 crc kubenswrapper[4781]: I0227 00:27:56.989053 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5fbbfd856b-vgvjg" podUID="49f24c54-4f24-4f97-a01a-04640bf67b0f" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.183:9311/healthcheck\": read tcp 10.217.0.2:52142->10.217.0.183:9311: read: connection reset by peer" Feb 27 00:27:56 crc kubenswrapper[4781]: I0227 00:27:56.989389 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5fbbfd856b-vgvjg" 
podUID="49f24c54-4f24-4f97-a01a-04640bf67b0f" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.183:9311/healthcheck\": read tcp 10.217.0.2:52134->10.217.0.183:9311: read: connection reset by peer" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.101283 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a732412d-8655-4df0-90ba-1bf854b6d8d1","Type":"ContainerStarted","Data":"14d3fcba4ac0c08489e958e9281bb38dcd169375967f24d085bf95a4995989d3"} Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.101332 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a732412d-8655-4df0-90ba-1bf854b6d8d1","Type":"ContainerStarted","Data":"57f8cc16c7b772f63445194c5db7782e3fcd2bdad4a28c47f2161d0b1572b6c9"} Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.104649 4781 generic.go:334] "Generic (PLEG): container finished" podID="49f24c54-4f24-4f97-a01a-04640bf67b0f" containerID="0f274fbd09031e3b8e38174b2cbe52a6c6f5f24b60283aea8a0e1a01875fd8b1" exitCode=0 Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.104998 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fbbfd856b-vgvjg" event={"ID":"49f24c54-4f24-4f97-a01a-04640bf67b0f","Type":"ContainerDied","Data":"0f274fbd09031e3b8e38174b2cbe52a6c6f5f24b60283aea8a0e1a01875fd8b1"} Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.396914 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.497935 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49f24c54-4f24-4f97-a01a-04640bf67b0f-combined-ca-bundle\") pod \"49f24c54-4f24-4f97-a01a-04640bf67b0f\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.498060 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5d79w\" (UniqueName: \"kubernetes.io/projected/49f24c54-4f24-4f97-a01a-04640bf67b0f-kube-api-access-5d79w\") pod \"49f24c54-4f24-4f97-a01a-04640bf67b0f\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.498179 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49f24c54-4f24-4f97-a01a-04640bf67b0f-logs\") pod \"49f24c54-4f24-4f97-a01a-04640bf67b0f\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.498206 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49f24c54-4f24-4f97-a01a-04640bf67b0f-config-data\") pod \"49f24c54-4f24-4f97-a01a-04640bf67b0f\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.498268 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49f24c54-4f24-4f97-a01a-04640bf67b0f-config-data-custom\") pod \"49f24c54-4f24-4f97-a01a-04640bf67b0f\" (UID: \"49f24c54-4f24-4f97-a01a-04640bf67b0f\") " Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.500174 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/49f24c54-4f24-4f97-a01a-04640bf67b0f-logs" (OuterVolumeSpecName: "logs") pod "49f24c54-4f24-4f97-a01a-04640bf67b0f" (UID: "49f24c54-4f24-4f97-a01a-04640bf67b0f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.516860 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49f24c54-4f24-4f97-a01a-04640bf67b0f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "49f24c54-4f24-4f97-a01a-04640bf67b0f" (UID: "49f24c54-4f24-4f97-a01a-04640bf67b0f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.529469 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49f24c54-4f24-4f97-a01a-04640bf67b0f-kube-api-access-5d79w" (OuterVolumeSpecName: "kube-api-access-5d79w") pod "49f24c54-4f24-4f97-a01a-04640bf67b0f" (UID: "49f24c54-4f24-4f97-a01a-04640bf67b0f"). InnerVolumeSpecName "kube-api-access-5d79w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.532162 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49f24c54-4f24-4f97-a01a-04640bf67b0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49f24c54-4f24-4f97-a01a-04640bf67b0f" (UID: "49f24c54-4f24-4f97-a01a-04640bf67b0f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.575353 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49f24c54-4f24-4f97-a01a-04640bf67b0f-config-data" (OuterVolumeSpecName: "config-data") pod "49f24c54-4f24-4f97-a01a-04640bf67b0f" (UID: "49f24c54-4f24-4f97-a01a-04640bf67b0f"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.600493 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49f24c54-4f24-4f97-a01a-04640bf67b0f-logs\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.600526 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49f24c54-4f24-4f97-a01a-04640bf67b0f-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.600538 4781 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49f24c54-4f24-4f97-a01a-04640bf67b0f-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.600548 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49f24c54-4f24-4f97-a01a-04640bf67b0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.600570 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5d79w\" (UniqueName: \"kubernetes.io/projected/49f24c54-4f24-4f97-a01a-04640bf67b0f-kube-api-access-5d79w\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.603095 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-g672n" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.701981 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbrfx\" (UniqueName: \"kubernetes.io/projected/87b3198c-30ab-415a-b24b-b26ab3da838e-kube-api-access-zbrfx\") pod \"87b3198c-30ab-415a-b24b-b26ab3da838e\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.702022 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87b3198c-30ab-415a-b24b-b26ab3da838e-combined-ca-bundle\") pod \"87b3198c-30ab-415a-b24b-b26ab3da838e\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.702059 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87b3198c-30ab-415a-b24b-b26ab3da838e-scripts\") pod \"87b3198c-30ab-415a-b24b-b26ab3da838e\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.702119 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/87b3198c-30ab-415a-b24b-b26ab3da838e-certs\") pod \"87b3198c-30ab-415a-b24b-b26ab3da838e\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.702196 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87b3198c-30ab-415a-b24b-b26ab3da838e-config-data\") pod \"87b3198c-30ab-415a-b24b-b26ab3da838e\" (UID: \"87b3198c-30ab-415a-b24b-b26ab3da838e\") " Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.707955 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/87b3198c-30ab-415a-b24b-b26ab3da838e-kube-api-access-zbrfx" (OuterVolumeSpecName: "kube-api-access-zbrfx") pod "87b3198c-30ab-415a-b24b-b26ab3da838e" (UID: "87b3198c-30ab-415a-b24b-b26ab3da838e"). InnerVolumeSpecName "kube-api-access-zbrfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.711537 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87b3198c-30ab-415a-b24b-b26ab3da838e-certs" (OuterVolumeSpecName: "certs") pod "87b3198c-30ab-415a-b24b-b26ab3da838e" (UID: "87b3198c-30ab-415a-b24b-b26ab3da838e"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.719395 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87b3198c-30ab-415a-b24b-b26ab3da838e-scripts" (OuterVolumeSpecName: "scripts") pod "87b3198c-30ab-415a-b24b-b26ab3da838e" (UID: "87b3198c-30ab-415a-b24b-b26ab3da838e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.745252 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87b3198c-30ab-415a-b24b-b26ab3da838e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "87b3198c-30ab-415a-b24b-b26ab3da838e" (UID: "87b3198c-30ab-415a-b24b-b26ab3da838e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.746484 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87b3198c-30ab-415a-b24b-b26ab3da838e-config-data" (OuterVolumeSpecName: "config-data") pod "87b3198c-30ab-415a-b24b-b26ab3da838e" (UID: "87b3198c-30ab-415a-b24b-b26ab3da838e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.804393 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbrfx\" (UniqueName: \"kubernetes.io/projected/87b3198c-30ab-415a-b24b-b26ab3da838e-kube-api-access-zbrfx\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.804741 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87b3198c-30ab-415a-b24b-b26ab3da838e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.804821 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87b3198c-30ab-415a-b24b-b26ab3da838e-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.804884 4781 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/87b3198c-30ab-415a-b24b-b26ab3da838e-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:57 crc kubenswrapper[4781]: I0227 00:27:57.804937 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87b3198c-30ab-415a-b24b-b26ab3da838e-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.117500 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-g672n" event={"ID":"87b3198c-30ab-415a-b24b-b26ab3da838e","Type":"ContainerDied","Data":"5ea5e68fe7fb3730a14c055ae47e47a12d2ed4ea16d87ceb507c87aa6e875602"} Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.118571 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ea5e68fe7fb3730a14c055ae47e47a12d2ed4ea16d87ceb507c87aa6e875602" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.118733 4781 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-g672n" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.127274 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a732412d-8655-4df0-90ba-1bf854b6d8d1","Type":"ContainerStarted","Data":"41f966adea97cdc475ad08a86a255ece7a9da3613c19d0f63f5a59a5a293320f"} Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.129825 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fbbfd856b-vgvjg" event={"ID":"49f24c54-4f24-4f97-a01a-04640bf67b0f","Type":"ContainerDied","Data":"abf074d9baa2f3d6e8969094139a58da187066e40f9840d7df7ac1542a6fb7f6"} Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.130021 4781 scope.go:117] "RemoveContainer" containerID="0f274fbd09031e3b8e38174b2cbe52a6c6f5f24b60283aea8a0e1a01875fd8b1" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.129924 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5fbbfd856b-vgvjg" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.162871 4781 scope.go:117] "RemoveContainer" containerID="ee82fae8a491ded998bd0190a2bb94c2ff762013e3316811c1d1f983f5c06787" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.185476 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5fbbfd856b-vgvjg"] Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.199600 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5fbbfd856b-vgvjg"] Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.301297 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 27 00:27:58 crc kubenswrapper[4781]: E0227 00:27:58.301683 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49f24c54-4f24-4f97-a01a-04640bf67b0f" containerName="barbican-api" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.301695 4781 
state_mem.go:107] "Deleted CPUSet assignment" podUID="49f24c54-4f24-4f97-a01a-04640bf67b0f" containerName="barbican-api" Feb 27 00:27:58 crc kubenswrapper[4781]: E0227 00:27:58.301712 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49f24c54-4f24-4f97-a01a-04640bf67b0f" containerName="barbican-api-log" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.301718 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="49f24c54-4f24-4f97-a01a-04640bf67b0f" containerName="barbican-api-log" Feb 27 00:27:58 crc kubenswrapper[4781]: E0227 00:27:58.301728 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87b3198c-30ab-415a-b24b-b26ab3da838e" containerName="cloudkitty-storageinit" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.301735 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="87b3198c-30ab-415a-b24b-b26ab3da838e" containerName="cloudkitty-storageinit" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.302762 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="49f24c54-4f24-4f97-a01a-04640bf67b0f" containerName="barbican-api-log" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.302780 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="49f24c54-4f24-4f97-a01a-04640bf67b0f" containerName="barbican-api" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.302791 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="87b3198c-30ab-415a-b24b-b26ab3da838e" containerName="cloudkitty-storageinit" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.303425 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.309978 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.310167 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.310272 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.310471 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.310576 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-qt68h" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.318589 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.376209 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-5mf9t"] Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.378616 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.400972 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-5mf9t"] Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.416836 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-5mf9t\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.417098 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-dns-svc\") pod \"dnsmasq-dns-67bdc55879-5mf9t\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.417220 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-5mf9t\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.417326 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4nk7\" (UniqueName: \"kubernetes.io/projected/39b2afc0-76d7-48e9-8528-f88e3ba22955-kube-api-access-w4nk7\") pod \"dnsmasq-dns-67bdc55879-5mf9t\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.417402 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-config\") pod \"dnsmasq-dns-67bdc55879-5mf9t\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.417487 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-5mf9t\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.520788 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-5mf9t\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.521536 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.521643 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.521862 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-5mf9t\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.522037 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4nk7\" (UniqueName: \"kubernetes.io/projected/39b2afc0-76d7-48e9-8528-f88e3ba22955-kube-api-access-w4nk7\") pod \"dnsmasq-dns-67bdc55879-5mf9t\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.522142 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-config\") pod \"dnsmasq-dns-67bdc55879-5mf9t\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.522232 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-5mf9t\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.522310 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-scripts\") pod \"cloudkitty-proc-0\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.522528 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-config-data\") pod \"cloudkitty-proc-0\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.522689 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-5mf9t\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.522918 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-dns-svc\") pod \"dnsmasq-dns-67bdc55879-5mf9t\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.522975 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-config\") pod \"dnsmasq-dns-67bdc55879-5mf9t\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.522987 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e66fa513-66e6-4821-ad96-4bfe56e359f1-certs\") pod \"cloudkitty-proc-0\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.523053 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mj2t\" (UniqueName: 
\"kubernetes.io/projected/e66fa513-66e6-4821-ad96-4bfe56e359f1-kube-api-access-5mj2t\") pod \"cloudkitty-proc-0\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.523398 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-5mf9t\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.523436 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-5mf9t\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.523586 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-dns-svc\") pod \"dnsmasq-dns-67bdc55879-5mf9t\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.545457 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4nk7\" (UniqueName: \"kubernetes.io/projected/39b2afc0-76d7-48e9-8528-f88e3ba22955-kube-api-access-w4nk7\") pod \"dnsmasq-dns-67bdc55879-5mf9t\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.567027 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.568732 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.573480 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.598607 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.625270 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-scripts\") pod \"cloudkitty-proc-0\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.625361 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-config-data\") pod \"cloudkitty-proc-0\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.625498 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e66fa513-66e6-4821-ad96-4bfe56e359f1-certs\") pod \"cloudkitty-proc-0\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.625528 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mj2t\" (UniqueName: \"kubernetes.io/projected/e66fa513-66e6-4821-ad96-4bfe56e359f1-kube-api-access-5mj2t\") pod \"cloudkitty-proc-0\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.625649 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.625713 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.641168 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-scripts\") pod \"cloudkitty-proc-0\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.641302 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.646305 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.646835 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e66fa513-66e6-4821-ad96-4bfe56e359f1-certs\") pod \"cloudkitty-proc-0\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc 
kubenswrapper[4781]: I0227 00:27:58.653999 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mj2t\" (UniqueName: \"kubernetes.io/projected/e66fa513-66e6-4821-ad96-4bfe56e359f1-kube-api-access-5mj2t\") pod \"cloudkitty-proc-0\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.656535 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-config-data\") pod \"cloudkitty-proc-0\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " pod="openstack/cloudkitty-proc-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.730355 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-scripts\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.730402 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.730466 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02a21c78-44f9-4e7a-81cc-8488b0fd942a-logs\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.730487 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" 
(UniqueName: \"kubernetes.io/projected/02a21c78-44f9-4e7a-81cc-8488b0fd942a-certs\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.730548 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-config-data\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.730575 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvdwn\" (UniqueName: \"kubernetes.io/projected/02a21c78-44f9-4e7a-81cc-8488b0fd942a-kube-api-access-cvdwn\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.730600 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.741596 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.832034 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvdwn\" (UniqueName: \"kubernetes.io/projected/02a21c78-44f9-4e7a-81cc-8488b0fd942a-kube-api-access-cvdwn\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.832095 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.832166 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-scripts\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.832193 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.832255 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02a21c78-44f9-4e7a-81cc-8488b0fd942a-logs\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.832280 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/02a21c78-44f9-4e7a-81cc-8488b0fd942a-certs\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.832352 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-config-data\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.833798 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02a21c78-44f9-4e7a-81cc-8488b0fd942a-logs\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.837235 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.841426 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-config-data\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.853591 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/02a21c78-44f9-4e7a-81cc-8488b0fd942a-certs\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 
27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.857239 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-scripts\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.861057 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvdwn\" (UniqueName: \"kubernetes.io/projected/02a21c78-44f9-4e7a-81cc-8488b0fd942a-kube-api-access-cvdwn\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.864597 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.907595 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 27 00:27:58 crc kubenswrapper[4781]: I0227 00:27:58.949972 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 27 00:27:59 crc kubenswrapper[4781]: I0227 00:27:59.322901 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49f24c54-4f24-4f97-a01a-04640bf67b0f" path="/var/lib/kubelet/pods/49f24c54-4f24-4f97-a01a-04640bf67b0f/volumes" Feb 27 00:27:59 crc kubenswrapper[4781]: I0227 00:27:59.337508 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-5mf9t"] Feb 27 00:27:59 crc kubenswrapper[4781]: I0227 00:27:59.510512 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 27 00:27:59 crc kubenswrapper[4781]: I0227 00:27:59.648548 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 27 00:27:59 crc kubenswrapper[4781]: W0227 00:27:59.694037 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode66fa513_66e6_4821_ad96_4bfe56e359f1.slice/crio-5565ddef70981fe2780f1609d8fa35f56abe3f10d059edc749ee7568f1b9b3fe WatchSource:0}: Error finding container 5565ddef70981fe2780f1609d8fa35f56abe3f10d059edc749ee7568f1b9b3fe: Status 404 returned error can't find the container with id 5565ddef70981fe2780f1609d8fa35f56abe3f10d059edc749ee7568f1b9b3fe Feb 27 00:27:59 crc kubenswrapper[4781]: I0227 00:27:59.991039 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.122129 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.138177 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535868-f5csp"] Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.139901 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535868-f5csp" Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.146215 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.146446 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.146695 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.155245 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535868-f5csp"] Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.184880 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"02a21c78-44f9-4e7a-81cc-8488b0fd942a","Type":"ContainerStarted","Data":"2388ad6e20a9a85367a6c9706990347b0761ca1a9c9408dbfc157fbb166ed81f"} Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.184927 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"02a21c78-44f9-4e7a-81cc-8488b0fd942a","Type":"ContainerStarted","Data":"452768a164fa502f5d71464f28cbab3d7b27dc992e50b59fecea9851c7c480c9"} Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.184937 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"02a21c78-44f9-4e7a-81cc-8488b0fd942a","Type":"ContainerStarted","Data":"210ddd6c96ce311a55b219a45ca27f47a76e0b915886957693e3838eb107b875"} Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.185846 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.192731 4781 generic.go:334] "Generic (PLEG): container finished" 
podID="39b2afc0-76d7-48e9-8528-f88e3ba22955" containerID="ba0fa606453c74eda00c418113d9f320bbbe55741c968eedcc82d3ff7571054d" exitCode=0 Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.192794 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" event={"ID":"39b2afc0-76d7-48e9-8528-f88e3ba22955","Type":"ContainerDied","Data":"ba0fa606453c74eda00c418113d9f320bbbe55741c968eedcc82d3ff7571054d"} Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.192822 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" event={"ID":"39b2afc0-76d7-48e9-8528-f88e3ba22955","Type":"ContainerStarted","Data":"ede845938dcbb2c0e3303591186eb47bf17d10a92d1b0dd61b8430ff2dd6aa13"} Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.196856 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"e66fa513-66e6-4821-ad96-4bfe56e359f1","Type":"ContainerStarted","Data":"5565ddef70981fe2780f1609d8fa35f56abe3f10d059edc749ee7568f1b9b3fe"} Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.227235 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=2.227213951 podStartE2EDuration="2.227213951s" podCreationTimestamp="2026-02-27 00:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:28:00.218013008 +0000 UTC m=+1349.475552562" watchObservedRunningTime="2026-02-27 00:28:00.227213951 +0000 UTC m=+1349.484753505" Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.288442 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb9mz\" (UniqueName: \"kubernetes.io/projected/f3df72f1-7ac9-4877-a7b4-a17b5c724303-kube-api-access-fb9mz\") pod \"auto-csr-approver-29535868-f5csp\" (UID: 
\"f3df72f1-7ac9-4877-a7b4-a17b5c724303\") " pod="openshift-infra/auto-csr-approver-29535868-f5csp" Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.390525 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb9mz\" (UniqueName: \"kubernetes.io/projected/f3df72f1-7ac9-4877-a7b4-a17b5c724303-kube-api-access-fb9mz\") pod \"auto-csr-approver-29535868-f5csp\" (UID: \"f3df72f1-7ac9-4877-a7b4-a17b5c724303\") " pod="openshift-infra/auto-csr-approver-29535868-f5csp" Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.412330 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb9mz\" (UniqueName: \"kubernetes.io/projected/f3df72f1-7ac9-4877-a7b4-a17b5c724303-kube-api-access-fb9mz\") pod \"auto-csr-approver-29535868-f5csp\" (UID: \"f3df72f1-7ac9-4877-a7b4-a17b5c724303\") " pod="openshift-infra/auto-csr-approver-29535868-f5csp" Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.459483 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535868-f5csp" Feb 27 00:28:00 crc kubenswrapper[4781]: I0227 00:28:00.896075 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535868-f5csp"] Feb 27 00:28:00 crc kubenswrapper[4781]: W0227 00:28:00.942734 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3df72f1_7ac9_4877_a7b4_a17b5c724303.slice/crio-e55a7008a665e255411e030af55e4960e0ce58f4f35f88a35bcbcae2103d9e43 WatchSource:0}: Error finding container e55a7008a665e255411e030af55e4960e0ce58f4f35f88a35bcbcae2103d9e43: Status 404 returned error can't find the container with id e55a7008a665e255411e030af55e4960e0ce58f4f35f88a35bcbcae2103d9e43 Feb 27 00:28:01 crc kubenswrapper[4781]: I0227 00:28:01.213473 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535868-f5csp" event={"ID":"f3df72f1-7ac9-4877-a7b4-a17b5c724303","Type":"ContainerStarted","Data":"e55a7008a665e255411e030af55e4960e0ce58f4f35f88a35bcbcae2103d9e43"} Feb 27 00:28:01 crc kubenswrapper[4781]: I0227 00:28:01.220307 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" event={"ID":"39b2afc0-76d7-48e9-8528-f88e3ba22955","Type":"ContainerStarted","Data":"bb8c0d69bd70d80999cf07d7e8306d44a8648ef91de2762edd1a659e5f8fb1d6"} Feb 27 00:28:01 crc kubenswrapper[4781]: I0227 00:28:01.221575 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:28:01 crc kubenswrapper[4781]: I0227 00:28:01.227524 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a732412d-8655-4df0-90ba-1bf854b6d8d1","Type":"ContainerStarted","Data":"e1347c9105935db12132917f879bb29404eb3328ae4b62fde3c6673f55672741"} Feb 27 00:28:01 crc kubenswrapper[4781]: I0227 00:28:01.227563 4781 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 27 00:28:01 crc kubenswrapper[4781]: I0227 00:28:01.239772 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" podStartSLOduration=3.239753891 podStartE2EDuration="3.239753891s" podCreationTimestamp="2026-02-27 00:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:28:01.237581253 +0000 UTC m=+1350.495120827" watchObservedRunningTime="2026-02-27 00:28:01.239753891 +0000 UTC m=+1350.497293445" Feb 27 00:28:01 crc kubenswrapper[4781]: I0227 00:28:01.267782 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.636315804 podStartE2EDuration="7.267763461s" podCreationTimestamp="2026-02-27 00:27:54 +0000 UTC" firstStartedPulling="2026-02-27 00:27:55.352065066 +0000 UTC m=+1344.609604620" lastFinishedPulling="2026-02-27 00:27:59.983512723 +0000 UTC m=+1349.241052277" observedRunningTime="2026-02-27 00:28:01.265647155 +0000 UTC m=+1350.523186709" watchObservedRunningTime="2026-02-27 00:28:01.267763461 +0000 UTC m=+1350.525303015" Feb 27 00:28:01 crc kubenswrapper[4781]: I0227 00:28:01.628390 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 27 00:28:01 crc kubenswrapper[4781]: I0227 00:28:01.939416 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 27 00:28:03 crc kubenswrapper[4781]: I0227 00:28:03.248317 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"e66fa513-66e6-4821-ad96-4bfe56e359f1","Type":"ContainerStarted","Data":"fdc3e6d3767980267676c3bb178abb962b9ba33efb09510bb187625fd32978dd"} Feb 27 00:28:03 crc kubenswrapper[4781]: I0227 00:28:03.250096 4781 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-infra/auto-csr-approver-29535868-f5csp" event={"ID":"f3df72f1-7ac9-4877-a7b4-a17b5c724303","Type":"ContainerStarted","Data":"58983f3a0d32568b0a106e31b532196dd7e3e78ec29a99f5dc4c44649ec4e605"} Feb 27 00:28:03 crc kubenswrapper[4781]: I0227 00:28:03.250599 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="02a21c78-44f9-4e7a-81cc-8488b0fd942a" containerName="cloudkitty-api-log" containerID="cri-o://452768a164fa502f5d71464f28cbab3d7b27dc992e50b59fecea9851c7c480c9" gracePeriod=30 Feb 27 00:28:03 crc kubenswrapper[4781]: I0227 00:28:03.250668 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="02a21c78-44f9-4e7a-81cc-8488b0fd942a" containerName="cloudkitty-api" containerID="cri-o://2388ad6e20a9a85367a6c9706990347b0761ca1a9c9408dbfc157fbb166ed81f" gracePeriod=30 Feb 27 00:28:03 crc kubenswrapper[4781]: I0227 00:28:03.273196 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.205754485 podStartE2EDuration="5.273162751s" podCreationTimestamp="2026-02-27 00:27:58 +0000 UTC" firstStartedPulling="2026-02-27 00:27:59.69685193 +0000 UTC m=+1348.954391494" lastFinishedPulling="2026-02-27 00:28:02.764260206 +0000 UTC m=+1352.021799760" observedRunningTime="2026-02-27 00:28:03.26670315 +0000 UTC m=+1352.524242704" watchObservedRunningTime="2026-02-27 00:28:03.273162751 +0000 UTC m=+1352.530702305" Feb 27 00:28:03 crc kubenswrapper[4781]: I0227 00:28:03.307326 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 27 00:28:03 crc kubenswrapper[4781]: I0227 00:28:03.321494 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535868-f5csp" podStartSLOduration=1.518159815 podStartE2EDuration="3.321474987s" podCreationTimestamp="2026-02-27 00:28:00 +0000 UTC" 
firstStartedPulling="2026-02-27 00:28:00.961639423 +0000 UTC m=+1350.219178967" lastFinishedPulling="2026-02-27 00:28:02.764954585 +0000 UTC m=+1352.022494139" observedRunningTime="2026-02-27 00:28:03.301725916 +0000 UTC m=+1352.559265470" watchObservedRunningTime="2026-02-27 00:28:03.321474987 +0000 UTC m=+1352.579014541" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.120853 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.267144 4781 generic.go:334] "Generic (PLEG): container finished" podID="02a21c78-44f9-4e7a-81cc-8488b0fd942a" containerID="2388ad6e20a9a85367a6c9706990347b0761ca1a9c9408dbfc157fbb166ed81f" exitCode=0 Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.267181 4781 generic.go:334] "Generic (PLEG): container finished" podID="02a21c78-44f9-4e7a-81cc-8488b0fd942a" containerID="452768a164fa502f5d71464f28cbab3d7b27dc992e50b59fecea9851c7c480c9" exitCode=143 Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.267318 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.267855 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"02a21c78-44f9-4e7a-81cc-8488b0fd942a","Type":"ContainerDied","Data":"2388ad6e20a9a85367a6c9706990347b0761ca1a9c9408dbfc157fbb166ed81f"} Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.267905 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"02a21c78-44f9-4e7a-81cc-8488b0fd942a","Type":"ContainerDied","Data":"452768a164fa502f5d71464f28cbab3d7b27dc992e50b59fecea9851c7c480c9"} Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.267919 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"02a21c78-44f9-4e7a-81cc-8488b0fd942a","Type":"ContainerDied","Data":"210ddd6c96ce311a55b219a45ca27f47a76e0b915886957693e3838eb107b875"} Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.267976 4781 scope.go:117] "RemoveContainer" containerID="2388ad6e20a9a85367a6c9706990347b0761ca1a9c9408dbfc157fbb166ed81f" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.270507 4781 generic.go:334] "Generic (PLEG): container finished" podID="f3df72f1-7ac9-4877-a7b4-a17b5c724303" containerID="58983f3a0d32568b0a106e31b532196dd7e3e78ec29a99f5dc4c44649ec4e605" exitCode=0 Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.270569 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535868-f5csp" event={"ID":"f3df72f1-7ac9-4877-a7b4-a17b5c724303","Type":"ContainerDied","Data":"58983f3a0d32568b0a106e31b532196dd7e3e78ec29a99f5dc4c44649ec4e605"} Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.277857 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02a21c78-44f9-4e7a-81cc-8488b0fd942a-logs\") pod \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\" 
(UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.277935 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-scripts\") pod \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.278075 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-combined-ca-bundle\") pod \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.278115 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-config-data-custom\") pod \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.278142 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-config-data\") pod \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.278230 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/02a21c78-44f9-4e7a-81cc-8488b0fd942a-certs\") pod \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.278263 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvdwn\" (UniqueName: 
\"kubernetes.io/projected/02a21c78-44f9-4e7a-81cc-8488b0fd942a-kube-api-access-cvdwn\") pod \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\" (UID: \"02a21c78-44f9-4e7a-81cc-8488b0fd942a\") " Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.278334 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02a21c78-44f9-4e7a-81cc-8488b0fd942a-logs" (OuterVolumeSpecName: "logs") pod "02a21c78-44f9-4e7a-81cc-8488b0fd942a" (UID: "02a21c78-44f9-4e7a-81cc-8488b0fd942a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.278842 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02a21c78-44f9-4e7a-81cc-8488b0fd942a-logs\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.288778 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-scripts" (OuterVolumeSpecName: "scripts") pod "02a21c78-44f9-4e7a-81cc-8488b0fd942a" (UID: "02a21c78-44f9-4e7a-81cc-8488b0fd942a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.313589 4781 scope.go:117] "RemoveContainer" containerID="452768a164fa502f5d71464f28cbab3d7b27dc992e50b59fecea9851c7c480c9" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.321114 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "02a21c78-44f9-4e7a-81cc-8488b0fd942a" (UID: "02a21c78-44f9-4e7a-81cc-8488b0fd942a"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.321527 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02a21c78-44f9-4e7a-81cc-8488b0fd942a-certs" (OuterVolumeSpecName: "certs") pod "02a21c78-44f9-4e7a-81cc-8488b0fd942a" (UID: "02a21c78-44f9-4e7a-81cc-8488b0fd942a"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.325069 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02a21c78-44f9-4e7a-81cc-8488b0fd942a-kube-api-access-cvdwn" (OuterVolumeSpecName: "kube-api-access-cvdwn") pod "02a21c78-44f9-4e7a-81cc-8488b0fd942a" (UID: "02a21c78-44f9-4e7a-81cc-8488b0fd942a"). InnerVolumeSpecName "kube-api-access-cvdwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.338306 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02a21c78-44f9-4e7a-81cc-8488b0fd942a" (UID: "02a21c78-44f9-4e7a-81cc-8488b0fd942a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.381128 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.381157 4781 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.381167 4781 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/02a21c78-44f9-4e7a-81cc-8488b0fd942a-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.381176 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvdwn\" (UniqueName: \"kubernetes.io/projected/02a21c78-44f9-4e7a-81cc-8488b0fd942a-kube-api-access-cvdwn\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.381186 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.428822 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-config-data" (OuterVolumeSpecName: "config-data") pod "02a21c78-44f9-4e7a-81cc-8488b0fd942a" (UID: "02a21c78-44f9-4e7a-81cc-8488b0fd942a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.478405 4781 scope.go:117] "RemoveContainer" containerID="2388ad6e20a9a85367a6c9706990347b0761ca1a9c9408dbfc157fbb166ed81f" Feb 27 00:28:04 crc kubenswrapper[4781]: E0227 00:28:04.478918 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2388ad6e20a9a85367a6c9706990347b0761ca1a9c9408dbfc157fbb166ed81f\": container with ID starting with 2388ad6e20a9a85367a6c9706990347b0761ca1a9c9408dbfc157fbb166ed81f not found: ID does not exist" containerID="2388ad6e20a9a85367a6c9706990347b0761ca1a9c9408dbfc157fbb166ed81f" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.478992 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2388ad6e20a9a85367a6c9706990347b0761ca1a9c9408dbfc157fbb166ed81f"} err="failed to get container status \"2388ad6e20a9a85367a6c9706990347b0761ca1a9c9408dbfc157fbb166ed81f\": rpc error: code = NotFound desc = could not find container \"2388ad6e20a9a85367a6c9706990347b0761ca1a9c9408dbfc157fbb166ed81f\": container with ID starting with 2388ad6e20a9a85367a6c9706990347b0761ca1a9c9408dbfc157fbb166ed81f not found: ID does not exist" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.479024 4781 scope.go:117] "RemoveContainer" containerID="452768a164fa502f5d71464f28cbab3d7b27dc992e50b59fecea9851c7c480c9" Feb 27 00:28:04 crc kubenswrapper[4781]: E0227 00:28:04.479333 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"452768a164fa502f5d71464f28cbab3d7b27dc992e50b59fecea9851c7c480c9\": container with ID starting with 452768a164fa502f5d71464f28cbab3d7b27dc992e50b59fecea9851c7c480c9 not found: ID does not exist" containerID="452768a164fa502f5d71464f28cbab3d7b27dc992e50b59fecea9851c7c480c9" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.479381 
4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"452768a164fa502f5d71464f28cbab3d7b27dc992e50b59fecea9851c7c480c9"} err="failed to get container status \"452768a164fa502f5d71464f28cbab3d7b27dc992e50b59fecea9851c7c480c9\": rpc error: code = NotFound desc = could not find container \"452768a164fa502f5d71464f28cbab3d7b27dc992e50b59fecea9851c7c480c9\": container with ID starting with 452768a164fa502f5d71464f28cbab3d7b27dc992e50b59fecea9851c7c480c9 not found: ID does not exist" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.479410 4781 scope.go:117] "RemoveContainer" containerID="2388ad6e20a9a85367a6c9706990347b0761ca1a9c9408dbfc157fbb166ed81f" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.479689 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2388ad6e20a9a85367a6c9706990347b0761ca1a9c9408dbfc157fbb166ed81f"} err="failed to get container status \"2388ad6e20a9a85367a6c9706990347b0761ca1a9c9408dbfc157fbb166ed81f\": rpc error: code = NotFound desc = could not find container \"2388ad6e20a9a85367a6c9706990347b0761ca1a9c9408dbfc157fbb166ed81f\": container with ID starting with 2388ad6e20a9a85367a6c9706990347b0761ca1a9c9408dbfc157fbb166ed81f not found: ID does not exist" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.479708 4781 scope.go:117] "RemoveContainer" containerID="452768a164fa502f5d71464f28cbab3d7b27dc992e50b59fecea9851c7c480c9" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.479922 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"452768a164fa502f5d71464f28cbab3d7b27dc992e50b59fecea9851c7c480c9"} err="failed to get container status \"452768a164fa502f5d71464f28cbab3d7b27dc992e50b59fecea9851c7c480c9\": rpc error: code = NotFound desc = could not find container \"452768a164fa502f5d71464f28cbab3d7b27dc992e50b59fecea9851c7c480c9\": container with ID starting with 
452768a164fa502f5d71464f28cbab3d7b27dc992e50b59fecea9851c7c480c9 not found: ID does not exist" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.483184 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02a21c78-44f9-4e7a-81cc-8488b0fd942a-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.503517 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.604050 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.612427 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.624606 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Feb 27 00:28:04 crc kubenswrapper[4781]: E0227 00:28:04.625206 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a21c78-44f9-4e7a-81cc-8488b0fd942a" containerName="cloudkitty-api" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.625289 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a21c78-44f9-4e7a-81cc-8488b0fd942a" containerName="cloudkitty-api" Feb 27 00:28:04 crc kubenswrapper[4781]: E0227 00:28:04.625373 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a21c78-44f9-4e7a-81cc-8488b0fd942a" containerName="cloudkitty-api-log" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.625434 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a21c78-44f9-4e7a-81cc-8488b0fd942a" containerName="cloudkitty-api-log" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.625682 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="02a21c78-44f9-4e7a-81cc-8488b0fd942a" containerName="cloudkitty-api-log" Feb 27 
00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.625766 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="02a21c78-44f9-4e7a-81cc-8488b0fd942a" containerName="cloudkitty-api" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.626815 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.628684 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-internal-svc" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.628848 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-public-svc" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.638074 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.644268 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.795143 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.795210 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-config-data\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.795306 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptk2k\" (UniqueName: 
\"kubernetes.io/projected/75721c64-91e7-468b-8157-9f7b0f8060b0-kube-api-access-ptk2k\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.795326 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75721c64-91e7-468b-8157-9f7b0f8060b0-logs\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.795368 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-scripts\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.795386 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.795408 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.795455 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-internal-tls-certs\") pod 
\"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.795488 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/75721c64-91e7-468b-8157-9f7b0f8060b0-certs\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.896934 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/75721c64-91e7-468b-8157-9f7b0f8060b0-certs\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.897046 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.897082 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-config-data\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.897124 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptk2k\" (UniqueName: \"kubernetes.io/projected/75721c64-91e7-468b-8157-9f7b0f8060b0-kube-api-access-ptk2k\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.897143 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75721c64-91e7-468b-8157-9f7b0f8060b0-logs\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.897184 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-scripts\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.897200 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.897228 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.897278 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.898777 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75721c64-91e7-468b-8157-9f7b0f8060b0-logs\") pod \"cloudkitty-api-0\" (UID: 
\"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.902297 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-scripts\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.903088 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/75721c64-91e7-468b-8157-9f7b0f8060b0-certs\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.907078 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.914230 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.921305 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.922700 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-config-data\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.940247 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.952291 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptk2k\" (UniqueName: \"kubernetes.io/projected/75721c64-91e7-468b-8157-9f7b0f8060b0-kube-api-access-ptk2k\") pod \"cloudkitty-api-0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") " pod="openstack/cloudkitty-api-0" Feb 27 00:28:04 crc kubenswrapper[4781]: I0227 00:28:04.978029 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 27 00:28:05 crc kubenswrapper[4781]: I0227 00:28:05.297903 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-proc-0" podUID="e66fa513-66e6-4821-ad96-4bfe56e359f1" containerName="cloudkitty-proc" containerID="cri-o://fdc3e6d3767980267676c3bb178abb962b9ba33efb09510bb187625fd32978dd" gracePeriod=30 Feb 27 00:28:05 crc kubenswrapper[4781]: I0227 00:28:05.338013 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02a21c78-44f9-4e7a-81cc-8488b0fd942a" path="/var/lib/kubelet/pods/02a21c78-44f9-4e7a-81cc-8488b0fd942a/volumes" Feb 27 00:28:05 crc kubenswrapper[4781]: I0227 00:28:05.710294 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:28:05 crc kubenswrapper[4781]: I0227 00:28:05.852668 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 27 00:28:05 crc kubenswrapper[4781]: I0227 00:28:05.962817 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6d64c6bb46-jcp5p"] Feb 27 00:28:05 crc kubenswrapper[4781]: I0227 00:28:05.964511 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.002022 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6d64c6bb46-jcp5p"] Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.046304 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-logs\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.046375 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-scripts\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.046528 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-config-data\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.046557 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-combined-ca-bundle\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.046582 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rrz7\" 
(UniqueName: \"kubernetes.io/projected/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-kube-api-access-4rrz7\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.046805 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-internal-tls-certs\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.046838 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-public-tls-certs\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.157757 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-logs\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.157807 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-scripts\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.157857 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-config-data\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.157884 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-combined-ca-bundle\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.157908 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rrz7\" (UniqueName: \"kubernetes.io/projected/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-kube-api-access-4rrz7\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.157982 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-internal-tls-certs\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.157999 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-public-tls-certs\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.159325 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-logs\") 
pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.164971 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535868-f5csp" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.165996 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-scripts\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.167666 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-combined-ca-bundle\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.168711 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-config-data\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.168757 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-public-tls-certs\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.186666 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-internal-tls-certs\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.195272 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rrz7\" (UniqueName: \"kubernetes.io/projected/5ff35aa7-7e5a-4069-8dc4-392e01a957e3-kube-api-access-4rrz7\") pod \"placement-6d64c6bb46-jcp5p\" (UID: \"5ff35aa7-7e5a-4069-8dc4-392e01a957e3\") " pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.321499 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"75721c64-91e7-468b-8157-9f7b0f8060b0","Type":"ContainerStarted","Data":"d2ded5bf94c8c14f72027674825df34d27ac9ed838299103764bc5aba595b241"} Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.321538 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"75721c64-91e7-468b-8157-9f7b0f8060b0","Type":"ContainerStarted","Data":"27f0f2f53c09daadc606bc872e1f5df520a0c8f2a01549f894ec755d7a09a157"} Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.327869 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535868-f5csp" event={"ID":"f3df72f1-7ac9-4877-a7b4-a17b5c724303","Type":"ContainerDied","Data":"e55a7008a665e255411e030af55e4960e0ce58f4f35f88a35bcbcae2103d9e43"} Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.327900 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e55a7008a665e255411e030af55e4960e0ce58f4f35f88a35bcbcae2103d9e43" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.327947 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535868-f5csp" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.366928 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fb9mz\" (UniqueName: \"kubernetes.io/projected/f3df72f1-7ac9-4877-a7b4-a17b5c724303-kube-api-access-fb9mz\") pod \"f3df72f1-7ac9-4877-a7b4-a17b5c724303\" (UID: \"f3df72f1-7ac9-4877-a7b4-a17b5c724303\") " Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.371028 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3df72f1-7ac9-4877-a7b4-a17b5c724303-kube-api-access-fb9mz" (OuterVolumeSpecName: "kube-api-access-fb9mz") pod "f3df72f1-7ac9-4877-a7b4-a17b5c724303" (UID: "f3df72f1-7ac9-4877-a7b4-a17b5c724303"). InnerVolumeSpecName "kube-api-access-fb9mz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.446235 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.469464 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fb9mz\" (UniqueName: \"kubernetes.io/projected/f3df72f1-7ac9-4877-a7b4-a17b5c724303-kube-api-access-fb9mz\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.554102 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-56459cf68c-4q7c8" Feb 27 00:28:06 crc kubenswrapper[4781]: I0227 00:28:06.959681 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6d64c6bb46-jcp5p"] Feb 27 00:28:06 crc kubenswrapper[4781]: W0227 00:28:06.959750 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ff35aa7_7e5a_4069_8dc4_392e01a957e3.slice/crio-5959b2f6e7fff1a2a51f3463d90c725e2eb81981c7fd050dcdeb36b804b2be0b WatchSource:0}: Error finding container 5959b2f6e7fff1a2a51f3463d90c725e2eb81981c7fd050dcdeb36b804b2be0b: Status 404 returned error can't find the container with id 5959b2f6e7fff1a2a51f3463d90c725e2eb81981c7fd050dcdeb36b804b2be0b Feb 27 00:28:07 crc kubenswrapper[4781]: I0227 00:28:07.253894 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535862-l9vc5"] Feb 27 00:28:07 crc kubenswrapper[4781]: I0227 00:28:07.266420 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535862-l9vc5"] Feb 27 00:28:07 crc kubenswrapper[4781]: I0227 00:28:07.326854 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="411dc0f9-584c-453b-a137-189ab8731570" path="/var/lib/kubelet/pods/411dc0f9-584c-453b-a137-189ab8731570/volumes" Feb 27 00:28:07 crc kubenswrapper[4781]: I0227 00:28:07.343370 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d64c6bb46-jcp5p" 
event={"ID":"5ff35aa7-7e5a-4069-8dc4-392e01a957e3","Type":"ContainerStarted","Data":"8844002fa649bbfd29001c87d27f71d4621e3669a75ccf85fd1badf27a87e1a6"} Feb 27 00:28:07 crc kubenswrapper[4781]: I0227 00:28:07.343410 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d64c6bb46-jcp5p" event={"ID":"5ff35aa7-7e5a-4069-8dc4-392e01a957e3","Type":"ContainerStarted","Data":"5959b2f6e7fff1a2a51f3463d90c725e2eb81981c7fd050dcdeb36b804b2be0b"} Feb 27 00:28:07 crc kubenswrapper[4781]: I0227 00:28:07.347484 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"75721c64-91e7-468b-8157-9f7b0f8060b0","Type":"ContainerStarted","Data":"e213bc3be73bd79fd15b3c136a6bd5766ca6edcbbbd34d62e83bc41711b9178a"} Feb 27 00:28:07 crc kubenswrapper[4781]: I0227 00:28:07.347760 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Feb 27 00:28:07 crc kubenswrapper[4781]: I0227 00:28:07.385541 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=3.385523224 podStartE2EDuration="3.385523224s" podCreationTimestamp="2026-02-27 00:28:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:28:07.375307534 +0000 UTC m=+1356.632847088" watchObservedRunningTime="2026-02-27 00:28:07.385523224 +0000 UTC m=+1356.643062778" Feb 27 00:28:08 crc kubenswrapper[4781]: I0227 00:28:08.365775 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d64c6bb46-jcp5p" event={"ID":"5ff35aa7-7e5a-4069-8dc4-392e01a957e3","Type":"ContainerStarted","Data":"e34be0223d33d9c3740d3959a945adda9dff38a4a277f5d5d8122ac8617d942e"} Feb 27 00:28:08 crc kubenswrapper[4781]: I0227 00:28:08.366136 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:08 crc 
kubenswrapper[4781]: I0227 00:28:08.366465 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:08 crc kubenswrapper[4781]: I0227 00:28:08.399502 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6d64c6bb46-jcp5p" podStartSLOduration=3.399485211 podStartE2EDuration="3.399485211s" podCreationTimestamp="2026-02-27 00:28:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:28:08.388112471 +0000 UTC m=+1357.645652025" watchObservedRunningTime="2026-02-27 00:28:08.399485211 +0000 UTC m=+1357.657024765" Feb 27 00:28:08 crc kubenswrapper[4781]: I0227 00:28:08.742797 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:28:08 crc kubenswrapper[4781]: I0227 00:28:08.798012 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-btbp6"] Feb 27 00:28:08 crc kubenswrapper[4781]: I0227 00:28:08.798512 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" podUID="f2a90e98-bb9f-436d-9a1c-8aebd91000e3" containerName="dnsmasq-dns" containerID="cri-o://8eb943556508c5cc9103fa044300406224b9b4973d8e501d8f7538f1c3573e24" gracePeriod=10 Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.242156 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 27 00:28:09 crc kubenswrapper[4781]: E0227 00:28:09.242710 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3df72f1-7ac9-4877-a7b4-a17b5c724303" containerName="oc" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.242734 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3df72f1-7ac9-4877-a7b4-a17b5c724303" containerName="oc" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 
00:28:09.242971 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3df72f1-7ac9-4877-a7b4-a17b5c724303" containerName="oc" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.243860 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.245585 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.251999 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.252196 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.252599 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-x24nv" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.348734 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnh77\" (UniqueName: \"kubernetes.io/projected/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-kube-api-access-pnh77\") pod \"openstackclient\" (UID: \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\") " pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.349175 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-openstack-config-secret\") pod \"openstackclient\" (UID: \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\") " pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.349256 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-openstack-config\") pod \"openstackclient\" (UID: \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\") " pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.350819 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\") " pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.386386 4781 generic.go:334] "Generic (PLEG): container finished" podID="f2a90e98-bb9f-436d-9a1c-8aebd91000e3" containerID="8eb943556508c5cc9103fa044300406224b9b4973d8e501d8f7538f1c3573e24" exitCode=0 Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.386446 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" event={"ID":"f2a90e98-bb9f-436d-9a1c-8aebd91000e3","Type":"ContainerDied","Data":"8eb943556508c5cc9103fa044300406224b9b4973d8e501d8f7538f1c3573e24"} Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.386473 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" event={"ID":"f2a90e98-bb9f-436d-9a1c-8aebd91000e3","Type":"ContainerDied","Data":"f5601a008ad9454c1a7af70c2d0c5712b2a38f8540f6108d4eb74d5c92b8bcd7"} Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.386484 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5601a008ad9454c1a7af70c2d0c5712b2a38f8540f6108d4eb74d5c92b8bcd7" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.388168 4781 generic.go:334] "Generic (PLEG): container finished" podID="e66fa513-66e6-4821-ad96-4bfe56e359f1" containerID="fdc3e6d3767980267676c3bb178abb962b9ba33efb09510bb187625fd32978dd" exitCode=0 Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.389246 4781 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"e66fa513-66e6-4821-ad96-4bfe56e359f1","Type":"ContainerDied","Data":"fdc3e6d3767980267676c3bb178abb962b9ba33efb09510bb187625fd32978dd"} Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.454784 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnh77\" (UniqueName: \"kubernetes.io/projected/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-kube-api-access-pnh77\") pod \"openstackclient\" (UID: \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\") " pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.454920 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-openstack-config-secret\") pod \"openstackclient\" (UID: \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\") " pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.454944 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-openstack-config\") pod \"openstackclient\" (UID: \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\") " pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.454985 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\") " pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.459177 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-openstack-config\") pod \"openstackclient\" 
(UID: \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\") " pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.463158 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-openstack-config-secret\") pod \"openstackclient\" (UID: \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\") " pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.463877 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\") " pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.472178 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnh77\" (UniqueName: \"kubernetes.io/projected/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-kube-api-access-pnh77\") pod \"openstackclient\" (UID: \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\") " pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.519833 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.520744 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.542755 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.544409 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.551951 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.557158 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-ovsdbserver-sb\") pod \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.557267 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-ovsdbserver-nb\") pod \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.557306 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-scripts\") pod \"e66fa513-66e6-4821-ad96-4bfe56e359f1\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.557342 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-config-data\") pod \"e66fa513-66e6-4821-ad96-4bfe56e359f1\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.557361 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-combined-ca-bundle\") pod \"e66fa513-66e6-4821-ad96-4bfe56e359f1\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " Feb 27 00:28:09 crc kubenswrapper[4781]: 
I0227 00:28:09.557400 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e66fa513-66e6-4821-ad96-4bfe56e359f1-certs\") pod \"e66fa513-66e6-4821-ad96-4bfe56e359f1\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.557429 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-dns-swift-storage-0\") pod \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.557502 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-config\") pod \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.557536 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4w96d\" (UniqueName: \"kubernetes.io/projected/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-kube-api-access-4w96d\") pod \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.557558 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-dns-svc\") pod \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\" (UID: \"f2a90e98-bb9f-436d-9a1c-8aebd91000e3\") " Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.557590 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-config-data-custom\") pod 
\"e66fa513-66e6-4821-ad96-4bfe56e359f1\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.557608 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mj2t\" (UniqueName: \"kubernetes.io/projected/e66fa513-66e6-4821-ad96-4bfe56e359f1-kube-api-access-5mj2t\") pod \"e66fa513-66e6-4821-ad96-4bfe56e359f1\" (UID: \"e66fa513-66e6-4821-ad96-4bfe56e359f1\") " Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.558430 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 27 00:28:09 crc kubenswrapper[4781]: E0227 00:28:09.558822 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e66fa513-66e6-4821-ad96-4bfe56e359f1" containerName="cloudkitty-proc" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.558845 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e66fa513-66e6-4821-ad96-4bfe56e359f1" containerName="cloudkitty-proc" Feb 27 00:28:09 crc kubenswrapper[4781]: E0227 00:28:09.558877 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2a90e98-bb9f-436d-9a1c-8aebd91000e3" containerName="init" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.558884 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2a90e98-bb9f-436d-9a1c-8aebd91000e3" containerName="init" Feb 27 00:28:09 crc kubenswrapper[4781]: E0227 00:28:09.558899 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2a90e98-bb9f-436d-9a1c-8aebd91000e3" containerName="dnsmasq-dns" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.558906 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2a90e98-bb9f-436d-9a1c-8aebd91000e3" containerName="dnsmasq-dns" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.559091 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2a90e98-bb9f-436d-9a1c-8aebd91000e3" containerName="dnsmasq-dns" Feb 27 00:28:09 crc 
kubenswrapper[4781]: I0227 00:28:09.559118 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e66fa513-66e6-4821-ad96-4bfe56e359f1" containerName="cloudkitty-proc" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.560404 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.565945 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e66fa513-66e6-4821-ad96-4bfe56e359f1-kube-api-access-5mj2t" (OuterVolumeSpecName: "kube-api-access-5mj2t") pod "e66fa513-66e6-4821-ad96-4bfe56e359f1" (UID: "e66fa513-66e6-4821-ad96-4bfe56e359f1"). InnerVolumeSpecName "kube-api-access-5mj2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.566799 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-scripts" (OuterVolumeSpecName: "scripts") pod "e66fa513-66e6-4821-ad96-4bfe56e359f1" (UID: "e66fa513-66e6-4821-ad96-4bfe56e359f1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.577037 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-kube-api-access-4w96d" (OuterVolumeSpecName: "kube-api-access-4w96d") pod "f2a90e98-bb9f-436d-9a1c-8aebd91000e3" (UID: "f2a90e98-bb9f-436d-9a1c-8aebd91000e3"). InnerVolumeSpecName "kube-api-access-4w96d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.583494 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e66fa513-66e6-4821-ad96-4bfe56e359f1" (UID: "e66fa513-66e6-4821-ad96-4bfe56e359f1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.594826 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e66fa513-66e6-4821-ad96-4bfe56e359f1-certs" (OuterVolumeSpecName: "certs") pod "e66fa513-66e6-4821-ad96-4bfe56e359f1" (UID: "e66fa513-66e6-4821-ad96-4bfe56e359f1"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.611703 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.611972 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e66fa513-66e6-4821-ad96-4bfe56e359f1" (UID: "e66fa513-66e6-4821-ad96-4bfe56e359f1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.661118 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/02c4875e-e180-4365-a00a-828ab5d95c34-openstack-config-secret\") pod \"openstackclient\" (UID: \"02c4875e-e180-4365-a00a-828ab5d95c34\") " pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.661190 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5rtv\" (UniqueName: \"kubernetes.io/projected/02c4875e-e180-4365-a00a-828ab5d95c34-kube-api-access-l5rtv\") pod \"openstackclient\" (UID: \"02c4875e-e180-4365-a00a-828ab5d95c34\") " pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.661538 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/02c4875e-e180-4365-a00a-828ab5d95c34-openstack-config\") pod \"openstackclient\" (UID: \"02c4875e-e180-4365-a00a-828ab5d95c34\") " pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.661809 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02c4875e-e180-4365-a00a-828ab5d95c34-combined-ca-bundle\") pod \"openstackclient\" (UID: \"02c4875e-e180-4365-a00a-828ab5d95c34\") " pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.661897 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.661913 4781 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.661928 4781 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e66fa513-66e6-4821-ad96-4bfe56e359f1-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.661938 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4w96d\" (UniqueName: \"kubernetes.io/projected/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-kube-api-access-4w96d\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.661947 4781 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.661956 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mj2t\" (UniqueName: \"kubernetes.io/projected/e66fa513-66e6-4821-ad96-4bfe56e359f1-kube-api-access-5mj2t\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.670794 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-config-data" (OuterVolumeSpecName: "config-data") pod "e66fa513-66e6-4821-ad96-4bfe56e359f1" (UID: "e66fa513-66e6-4821-ad96-4bfe56e359f1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.671478 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f2a90e98-bb9f-436d-9a1c-8aebd91000e3" (UID: "f2a90e98-bb9f-436d-9a1c-8aebd91000e3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.692889 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-config" (OuterVolumeSpecName: "config") pod "f2a90e98-bb9f-436d-9a1c-8aebd91000e3" (UID: "f2a90e98-bb9f-436d-9a1c-8aebd91000e3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:28:09 crc kubenswrapper[4781]: E0227 00:28:09.725068 4781 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 27 00:28:09 crc kubenswrapper[4781]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_23ff7bad-67ec-4ef6-b3b9-c997a99a62b4_0(6bca2a3521cbb1795eee380ff04718caf1ceb0e3915c9a371a67015f320b28d5): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"6bca2a3521cbb1795eee380ff04718caf1ceb0e3915c9a371a67015f320b28d5" Netns:"/var/run/netns/c947b9e9-9f00-44f6-85ce-84835c04cc12" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=6bca2a3521cbb1795eee380ff04718caf1ceb0e3915c9a371a67015f320b28d5;K8S_POD_UID=23ff7bad-67ec-4ef6-b3b9-c997a99a62b4" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: 
[openstack/openstackclient/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4]: expected pod UID "23ff7bad-67ec-4ef6-b3b9-c997a99a62b4" but got "02c4875e-e180-4365-a00a-828ab5d95c34" from Kube API Feb 27 00:28:09 crc kubenswrapper[4781]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 27 00:28:09 crc kubenswrapper[4781]: > Feb 27 00:28:09 crc kubenswrapper[4781]: E0227 00:28:09.725599 4781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 27 00:28:09 crc kubenswrapper[4781]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_23ff7bad-67ec-4ef6-b3b9-c997a99a62b4_0(6bca2a3521cbb1795eee380ff04718caf1ceb0e3915c9a371a67015f320b28d5): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"6bca2a3521cbb1795eee380ff04718caf1ceb0e3915c9a371a67015f320b28d5" Netns:"/var/run/netns/c947b9e9-9f00-44f6-85ce-84835c04cc12" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=6bca2a3521cbb1795eee380ff04718caf1ceb0e3915c9a371a67015f320b28d5;K8S_POD_UID=23ff7bad-67ec-4ef6-b3b9-c997a99a62b4" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4]: expected pod UID "23ff7bad-67ec-4ef6-b3b9-c997a99a62b4" but got "02c4875e-e180-4365-a00a-828ab5d95c34" from Kube API Feb 27 00:28:09 crc kubenswrapper[4781]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 27 00:28:09 crc kubenswrapper[4781]: > pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.725598 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f2a90e98-bb9f-436d-9a1c-8aebd91000e3" (UID: "f2a90e98-bb9f-436d-9a1c-8aebd91000e3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.741867 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f2a90e98-bb9f-436d-9a1c-8aebd91000e3" (UID: "f2a90e98-bb9f-436d-9a1c-8aebd91000e3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.744380 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f2a90e98-bb9f-436d-9a1c-8aebd91000e3" (UID: "f2a90e98-bb9f-436d-9a1c-8aebd91000e3"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.762978 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02c4875e-e180-4365-a00a-828ab5d95c34-combined-ca-bundle\") pod \"openstackclient\" (UID: \"02c4875e-e180-4365-a00a-828ab5d95c34\") " pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.763194 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/02c4875e-e180-4365-a00a-828ab5d95c34-openstack-config-secret\") pod \"openstackclient\" (UID: \"02c4875e-e180-4365-a00a-828ab5d95c34\") " pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.763327 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5rtv\" (UniqueName: \"kubernetes.io/projected/02c4875e-e180-4365-a00a-828ab5d95c34-kube-api-access-l5rtv\") pod \"openstackclient\" (UID: \"02c4875e-e180-4365-a00a-828ab5d95c34\") " pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.763743 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/02c4875e-e180-4365-a00a-828ab5d95c34-openstack-config\") pod \"openstackclient\" (UID: \"02c4875e-e180-4365-a00a-828ab5d95c34\") " pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.763898 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e66fa513-66e6-4821-ad96-4bfe56e359f1-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.763974 4781 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.764040 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.764098 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.764721 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.764793 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2a90e98-bb9f-436d-9a1c-8aebd91000e3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.765614 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/02c4875e-e180-4365-a00a-828ab5d95c34-openstack-config\") pod \"openstackclient\" (UID: \"02c4875e-e180-4365-a00a-828ab5d95c34\") " pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.766573 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02c4875e-e180-4365-a00a-828ab5d95c34-combined-ca-bundle\") pod \"openstackclient\" (UID: \"02c4875e-e180-4365-a00a-828ab5d95c34\") " pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.766996 4781 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/02c4875e-e180-4365-a00a-828ab5d95c34-openstack-config-secret\") pod \"openstackclient\" (UID: \"02c4875e-e180-4365-a00a-828ab5d95c34\") " pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.778162 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5rtv\" (UniqueName: \"kubernetes.io/projected/02c4875e-e180-4365-a00a-828ab5d95c34-kube-api-access-l5rtv\") pod \"openstackclient\" (UID: \"02c4875e-e180-4365-a00a-828ab5d95c34\") " pod="openstack/openstackclient" Feb 27 00:28:09 crc kubenswrapper[4781]: I0227 00:28:09.905981 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.409730 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.411531 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.411534 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"e66fa513-66e6-4821-ad96-4bfe56e359f1","Type":"ContainerDied","Data":"5565ddef70981fe2780f1609d8fa35f56abe3f10d059edc749ee7568f1b9b3fe"} Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.411581 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.411606 4781 scope.go:117] "RemoveContainer" containerID="fdc3e6d3767980267676c3bb178abb962b9ba33efb09510bb187625fd32978dd" Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.411581 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-btbp6" Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.424558 4781 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="23ff7bad-67ec-4ef6-b3b9-c997a99a62b4" podUID="02c4875e-e180-4365-a00a-828ab5d95c34" Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.466925 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.478232 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-combined-ca-bundle\") pod \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\" (UID: \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\") " Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.478303 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-openstack-config\") pod \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\" (UID: \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\") " Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.478454 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-openstack-config-secret\") pod \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\" (UID: \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\") " Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.478587 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnh77\" (UniqueName: \"kubernetes.io/projected/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-kube-api-access-pnh77\") pod \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\" (UID: \"23ff7bad-67ec-4ef6-b3b9-c997a99a62b4\") " Feb 27 00:28:10 crc 
kubenswrapper[4781]: I0227 00:28:10.478972 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "23ff7bad-67ec-4ef6-b3b9-c997a99a62b4" (UID: "23ff7bad-67ec-4ef6-b3b9-c997a99a62b4"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.479460 4781 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-openstack-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.490206 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "23ff7bad-67ec-4ef6-b3b9-c997a99a62b4" (UID: "23ff7bad-67ec-4ef6-b3b9-c997a99a62b4"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.490431 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"]
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.490394 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-kube-api-access-pnh77" (OuterVolumeSpecName: "kube-api-access-pnh77") pod "23ff7bad-67ec-4ef6-b3b9-c997a99a62b4" (UID: "23ff7bad-67ec-4ef6-b3b9-c997a99a62b4"). InnerVolumeSpecName "kube-api-access-pnh77". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.493764 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23ff7bad-67ec-4ef6-b3b9-c997a99a62b4" (UID: "23ff7bad-67ec-4ef6-b3b9-c997a99a62b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.524499 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-proc-0"]
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.537695 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-btbp6"]
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.561824 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"]
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.566734 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.570000 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.573208 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-btbp6"]
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.582255 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"]
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.585166 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.586231 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwnr7\" (UniqueName: \"kubernetes.io/projected/db34a476-dd22-4085-bb2c-a8e57b0d9889-kube-api-access-qwnr7\") pod \"cloudkitty-proc-0\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.586357 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-config-data\") pod \"cloudkitty-proc-0\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.586551 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.587738 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-scripts\") pod \"cloudkitty-proc-0\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.587875 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/db34a476-dd22-4085-bb2c-a8e57b0d9889-certs\") pod \"cloudkitty-proc-0\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.588170 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnh77\" (UniqueName: \"kubernetes.io/projected/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-kube-api-access-pnh77\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.588231 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.588286 4781 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.690417 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.690994 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwnr7\" (UniqueName: \"kubernetes.io/projected/db34a476-dd22-4085-bb2c-a8e57b0d9889-kube-api-access-qwnr7\") pod \"cloudkitty-proc-0\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.691093 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-config-data\") pod \"cloudkitty-proc-0\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.691207 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.691318 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-scripts\") pod \"cloudkitty-proc-0\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.691440 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/db34a476-dd22-4085-bb2c-a8e57b0d9889-certs\") pod \"cloudkitty-proc-0\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.695757 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.696281 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-scripts\") pod \"cloudkitty-proc-0\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.696439 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/db34a476-dd22-4085-bb2c-a8e57b0d9889-certs\") pod \"cloudkitty-proc-0\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.696541 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.697092 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-config-data\") pod \"cloudkitty-proc-0\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.707691 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwnr7\" (UniqueName: \"kubernetes.io/projected/db34a476-dd22-4085-bb2c-a8e57b0d9889-kube-api-access-qwnr7\") pod \"cloudkitty-proc-0\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:10 crc kubenswrapper[4781]: I0227 00:28:10.941059 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0"
Feb 27 00:28:11 crc kubenswrapper[4781]: I0227 00:28:11.340728 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23ff7bad-67ec-4ef6-b3b9-c997a99a62b4" path="/var/lib/kubelet/pods/23ff7bad-67ec-4ef6-b3b9-c997a99a62b4/volumes"
Feb 27 00:28:11 crc kubenswrapper[4781]: I0227 00:28:11.342529 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e66fa513-66e6-4821-ad96-4bfe56e359f1" path="/var/lib/kubelet/pods/e66fa513-66e6-4821-ad96-4bfe56e359f1/volumes"
Feb 27 00:28:11 crc kubenswrapper[4781]: I0227 00:28:11.343997 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2a90e98-bb9f-436d-9a1c-8aebd91000e3" path="/var/lib/kubelet/pods/f2a90e98-bb9f-436d-9a1c-8aebd91000e3/volumes"
Feb 27 00:28:11 crc kubenswrapper[4781]: I0227 00:28:11.423649 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"02c4875e-e180-4365-a00a-828ab5d95c34","Type":"ContainerStarted","Data":"27bb25f3bd1c0aa5e7843534c38a1b39afba807fb84921a9189815b48bf5d197"}
Feb 27 00:28:11 crc kubenswrapper[4781]: I0227 00:28:11.431113 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 27 00:28:11 crc kubenswrapper[4781]: I0227 00:28:11.436539 4781 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="23ff7bad-67ec-4ef6-b3b9-c997a99a62b4" podUID="02c4875e-e180-4365-a00a-828ab5d95c34"
Feb 27 00:28:11 crc kubenswrapper[4781]: I0227 00:28:11.491292 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"]
Feb 27 00:28:11 crc kubenswrapper[4781]: W0227 00:28:11.514794 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb34a476_dd22_4085_bb2c_a8e57b0d9889.slice/crio-3977a787677f2d7efa8ed48ba20f2b14940f8edb53792eab26f57aa6da59e0d1 WatchSource:0}: Error finding container 3977a787677f2d7efa8ed48ba20f2b14940f8edb53792eab26f57aa6da59e0d1: Status 404 returned error can't find the container with id 3977a787677f2d7efa8ed48ba20f2b14940f8edb53792eab26f57aa6da59e0d1
Feb 27 00:28:12 crc kubenswrapper[4781]: I0227 00:28:12.446056 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"db34a476-dd22-4085-bb2c-a8e57b0d9889","Type":"ContainerStarted","Data":"b68ba0c8681372d6b46f2eda69405ebdb37954378e7c21495676222df54a1d3a"}
Feb 27 00:28:12 crc kubenswrapper[4781]: I0227 00:28:12.446639 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"db34a476-dd22-4085-bb2c-a8e57b0d9889","Type":"ContainerStarted","Data":"3977a787677f2d7efa8ed48ba20f2b14940f8edb53792eab26f57aa6da59e0d1"}
Feb 27 00:28:12 crc kubenswrapper[4781]: I0227 00:28:12.467952 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.467936862 podStartE2EDuration="2.467936862s" podCreationTimestamp="2026-02-27 00:28:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:28:12.461615711 +0000 UTC m=+1361.719155265" watchObservedRunningTime="2026-02-27 00:28:12.467936862 +0000 UTC m=+1361.725476416"
Feb 27 00:28:13 crc kubenswrapper[4781]: I0227 00:28:13.893155 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-56f5d76fc7-rbhdd"
Feb 27 00:28:13 crc kubenswrapper[4781]: I0227 00:28:13.986231 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5445c56cbd-fmcjz"]
Feb 27 00:28:13 crc kubenswrapper[4781]: I0227 00:28:13.986475 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5445c56cbd-fmcjz" podUID="34294cdd-a18f-4453-8d43-c4d1290e3c59" containerName="neutron-api" containerID="cri-o://0384541fca62a0c17aeca1e73d81a12f432aa7f744f83cfe8433a7d935539961" gracePeriod=30
Feb 27 00:28:13 crc kubenswrapper[4781]: I0227 00:28:13.986600 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5445c56cbd-fmcjz" podUID="34294cdd-a18f-4453-8d43-c4d1290e3c59" containerName="neutron-httpd" containerID="cri-o://24cfe8100e51fc567495df6c5f9d60a27bc0381d4315fb54a1ac7e37d2a6bf89" gracePeriod=30
Feb 27 00:28:14 crc kubenswrapper[4781]: I0227 00:28:14.516822 4781 generic.go:334] "Generic (PLEG): container finished" podID="34294cdd-a18f-4453-8d43-c4d1290e3c59" containerID="24cfe8100e51fc567495df6c5f9d60a27bc0381d4315fb54a1ac7e37d2a6bf89" exitCode=0
Feb 27 00:28:14 crc kubenswrapper[4781]: I0227 00:28:14.516871 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5445c56cbd-fmcjz" event={"ID":"34294cdd-a18f-4453-8d43-c4d1290e3c59","Type":"ContainerDied","Data":"24cfe8100e51fc567495df6c5f9d60a27bc0381d4315fb54a1ac7e37d2a6bf89"}
Feb 27 00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.757388 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5c945d84cf-z5v9s"]
Feb 27 00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.769326 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5c945d84cf-z5v9s"]
Feb 27 00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.769424 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5c945d84cf-z5v9s"
Feb 27 00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.772989 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Feb 27 00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.773087 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc"
Feb 27 00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.773269 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc"
Feb 27 00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.895130 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ba5117-540f-448d-aac6-6fde482f5f14-combined-ca-bundle\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s"
Feb 27 00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.895191 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8ba5117-540f-448d-aac6-6fde482f5f14-log-httpd\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s"
Feb 27 00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.895321 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8ba5117-540f-448d-aac6-6fde482f5f14-run-httpd\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s"
Feb 27 00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.895377 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52dfd\" (UniqueName: \"kubernetes.io/projected/e8ba5117-540f-448d-aac6-6fde482f5f14-kube-api-access-52dfd\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s"
Feb 27 00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.895448 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ba5117-540f-448d-aac6-6fde482f5f14-public-tls-certs\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s"
Feb 27 00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.895648 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ba5117-540f-448d-aac6-6fde482f5f14-config-data\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s"
Feb 27 00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.895856 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ba5117-540f-448d-aac6-6fde482f5f14-internal-tls-certs\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s"
Feb 27 00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.895905 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e8ba5117-540f-448d-aac6-6fde482f5f14-etc-swift\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s"
Feb 27 00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.999098 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8ba5117-540f-448d-aac6-6fde482f5f14-log-httpd\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s"
Feb 27 00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.999161 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8ba5117-540f-448d-aac6-6fde482f5f14-run-httpd\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s"
Feb 27 00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.999207 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52dfd\" (UniqueName: \"kubernetes.io/projected/e8ba5117-540f-448d-aac6-6fde482f5f14-kube-api-access-52dfd\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s"
Feb 27 00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.999258 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ba5117-540f-448d-aac6-6fde482f5f14-public-tls-certs\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s"
Feb 27 00:28:16 crc kubenswrapper[4781]: I0227 00:28:16.999367 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ba5117-540f-448d-aac6-6fde482f5f14-config-data\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s"
Feb 27 00:28:17 crc kubenswrapper[4781]: I0227 00:28:16.999609 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8ba5117-540f-448d-aac6-6fde482f5f14-log-httpd\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s"
Feb 27 00:28:17 crc kubenswrapper[4781]: I0227 00:28:16.999676 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8ba5117-540f-448d-aac6-6fde482f5f14-run-httpd\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s"
Feb 27 00:28:17 crc kubenswrapper[4781]: I0227 00:28:16.999541 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ba5117-540f-448d-aac6-6fde482f5f14-internal-tls-certs\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s"
Feb 27 00:28:17 crc kubenswrapper[4781]: I0227 00:28:17.000153 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e8ba5117-540f-448d-aac6-6fde482f5f14-etc-swift\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s"
Feb 27 00:28:17 crc kubenswrapper[4781]: I0227 00:28:17.000244 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ba5117-540f-448d-aac6-6fde482f5f14-combined-ca-bundle\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s"
Feb 27 00:28:17 crc kubenswrapper[4781]: I0227 00:28:17.007049 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ba5117-540f-448d-aac6-6fde482f5f14-internal-tls-certs\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s"
Feb 27 00:28:17 crc kubenswrapper[4781]: I0227 00:28:17.007998 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e8ba5117-540f-448d-aac6-6fde482f5f14-etc-swift\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s"
Feb 27 00:28:17 crc kubenswrapper[4781]: I0227 00:28:17.008869 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ba5117-540f-448d-aac6-6fde482f5f14-config-data\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s"
Feb 27 00:28:17 crc kubenswrapper[4781]: I0227 00:28:17.009555 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ba5117-540f-448d-aac6-6fde482f5f14-combined-ca-bundle\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s"
Feb 27 00:28:17 crc kubenswrapper[4781]: I0227 00:28:17.020180 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ba5117-540f-448d-aac6-6fde482f5f14-public-tls-certs\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s"
Feb 27 00:28:17 crc kubenswrapper[4781]: I0227 00:28:17.020560 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52dfd\" (UniqueName: \"kubernetes.io/projected/e8ba5117-540f-448d-aac6-6fde482f5f14-kube-api-access-52dfd\") pod \"swift-proxy-5c945d84cf-z5v9s\" (UID: \"e8ba5117-540f-448d-aac6-6fde482f5f14\") " pod="openstack/swift-proxy-5c945d84cf-z5v9s"
Feb 27 00:28:17 crc kubenswrapper[4781]: I0227 00:28:17.107495 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5c945d84cf-z5v9s"
Feb 27 00:28:17 crc kubenswrapper[4781]: I0227 00:28:17.705077 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 27 00:28:17 crc kubenswrapper[4781]: I0227 00:28:17.705812 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerName="ceilometer-central-agent" containerID="cri-o://14d3fcba4ac0c08489e958e9281bb38dcd169375967f24d085bf95a4995989d3" gracePeriod=30
Feb 27 00:28:17 crc kubenswrapper[4781]: I0227 00:28:17.706399 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerName="proxy-httpd" containerID="cri-o://e1347c9105935db12132917f879bb29404eb3328ae4b62fde3c6673f55672741" gracePeriod=30
Feb 27 00:28:17 crc kubenswrapper[4781]: I0227 00:28:17.706441 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerName="ceilometer-notification-agent" containerID="cri-o://57f8cc16c7b772f63445194c5db7782e3fcd2bdad4a28c47f2161d0b1572b6c9" gracePeriod=30
Feb 27 00:28:17 crc kubenswrapper[4781]: I0227 00:28:17.706393 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerName="sg-core" containerID="cri-o://41f966adea97cdc475ad08a86a255ece7a9da3613c19d0f63f5a59a5a293320f" gracePeriod=30
Feb 27 00:28:17 crc kubenswrapper[4781]: I0227 00:28:17.718186 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.192:3000/\": EOF"
Feb 27 00:28:18 crc kubenswrapper[4781]: I0227 00:28:18.565267 4781 generic.go:334] "Generic (PLEG): container finished" podID="34294cdd-a18f-4453-8d43-c4d1290e3c59" containerID="0384541fca62a0c17aeca1e73d81a12f432aa7f744f83cfe8433a7d935539961" exitCode=0
Feb 27 00:28:18 crc kubenswrapper[4781]: I0227 00:28:18.565332 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5445c56cbd-fmcjz" event={"ID":"34294cdd-a18f-4453-8d43-c4d1290e3c59","Type":"ContainerDied","Data":"0384541fca62a0c17aeca1e73d81a12f432aa7f744f83cfe8433a7d935539961"}
Feb 27 00:28:18 crc kubenswrapper[4781]: I0227 00:28:18.568244 4781 generic.go:334] "Generic (PLEG): container finished" podID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerID="e1347c9105935db12132917f879bb29404eb3328ae4b62fde3c6673f55672741" exitCode=0
Feb 27 00:28:18 crc kubenswrapper[4781]: I0227 00:28:18.568262 4781 generic.go:334] "Generic (PLEG): container finished" podID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerID="41f966adea97cdc475ad08a86a255ece7a9da3613c19d0f63f5a59a5a293320f" exitCode=2
Feb 27 00:28:18 crc kubenswrapper[4781]: I0227 00:28:18.568270 4781 generic.go:334] "Generic (PLEG): container finished" podID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerID="14d3fcba4ac0c08489e958e9281bb38dcd169375967f24d085bf95a4995989d3" exitCode=0
Feb 27 00:28:18 crc kubenswrapper[4781]: I0227 00:28:18.568284 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a732412d-8655-4df0-90ba-1bf854b6d8d1","Type":"ContainerDied","Data":"e1347c9105935db12132917f879bb29404eb3328ae4b62fde3c6673f55672741"}
Feb 27 00:28:18 crc kubenswrapper[4781]: I0227 00:28:18.568297 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a732412d-8655-4df0-90ba-1bf854b6d8d1","Type":"ContainerDied","Data":"41f966adea97cdc475ad08a86a255ece7a9da3613c19d0f63f5a59a5a293320f"}
Feb 27 00:28:18 crc kubenswrapper[4781]: I0227 00:28:18.568307 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a732412d-8655-4df0-90ba-1bf854b6d8d1","Type":"ContainerDied","Data":"14d3fcba4ac0c08489e958e9281bb38dcd169375967f24d085bf95a4995989d3"}
Feb 27 00:28:18 crc kubenswrapper[4781]: I0227 00:28:18.766871 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 27 00:28:18 crc kubenswrapper[4781]: I0227 00:28:18.767106 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cb47b6b2-760a-4899-84f6-fdf1bd62a418" containerName="glance-log" containerID="cri-o://5483ad8c7ab58752a9371dfb8baad38f002e9c4bc521ec62a86d28db8755aca8" gracePeriod=30
Feb 27 00:28:18 crc kubenswrapper[4781]: I0227 00:28:18.767206 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cb47b6b2-760a-4899-84f6-fdf1bd62a418" containerName="glance-httpd" containerID="cri-o://fd234a650b390b48c2c62ec04eb6c4e5afa5d6f4b0db395429958fa19cde51f2" gracePeriod=30
Feb 27 00:28:19 crc kubenswrapper[4781]: I0227 00:28:19.583886 4781 generic.go:334] "Generic (PLEG): container finished" podID="cb47b6b2-760a-4899-84f6-fdf1bd62a418" containerID="5483ad8c7ab58752a9371dfb8baad38f002e9c4bc521ec62a86d28db8755aca8" exitCode=143
Feb 27 00:28:19 crc kubenswrapper[4781]: I0227 00:28:19.583922 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cb47b6b2-760a-4899-84f6-fdf1bd62a418","Type":"ContainerDied","Data":"5483ad8c7ab58752a9371dfb8baad38f002e9c4bc521ec62a86d28db8755aca8"}
Feb 27 00:28:19 crc kubenswrapper[4781]: I0227 00:28:19.711537 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 27 00:28:19 crc kubenswrapper[4781]: I0227 00:28:19.711843 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6ef40468-5e47-4e34-a641-bfbe7803d480" containerName="glance-log" containerID="cri-o://1cfa42ca96106627e8f6683c51b74701010592d0677332d334a174eb6459416d" gracePeriod=30
Feb 27 00:28:19 crc kubenswrapper[4781]: I0227 00:28:19.711938 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6ef40468-5e47-4e34-a641-bfbe7803d480" containerName="glance-httpd" containerID="cri-o://16a69c06f9cb944df009af4983f6213dea2db781c728ba1df35b1c181223d614" gracePeriod=30
Feb 27 00:28:20 crc kubenswrapper[4781]: I0227 00:28:20.597046 4781 generic.go:334] "Generic (PLEG): container finished" podID="6ef40468-5e47-4e34-a641-bfbe7803d480" containerID="1cfa42ca96106627e8f6683c51b74701010592d0677332d334a174eb6459416d" exitCode=143
Feb 27 00:28:20 crc kubenswrapper[4781]: I0227 00:28:20.597090 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6ef40468-5e47-4e34-a641-bfbe7803d480","Type":"ContainerDied","Data":"1cfa42ca96106627e8f6683c51b74701010592d0677332d334a174eb6459416d"}
Feb 27 00:28:21 crc kubenswrapper[4781]: I0227 00:28:21.611248 4781 generic.go:334] "Generic (PLEG): container finished" podID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerID="57f8cc16c7b772f63445194c5db7782e3fcd2bdad4a28c47f2161d0b1572b6c9" exitCode=0
Feb 27 00:28:21 crc kubenswrapper[4781]: I0227 00:28:21.611339 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a732412d-8655-4df0-90ba-1bf854b6d8d1","Type":"ContainerDied","Data":"57f8cc16c7b772f63445194c5db7782e3fcd2bdad4a28c47f2161d0b1572b6c9"}
Feb 27 00:28:22 crc kubenswrapper[4781]: I0227 00:28:22.662995 4781 generic.go:334] "Generic (PLEG): container finished" podID="cb47b6b2-760a-4899-84f6-fdf1bd62a418" containerID="fd234a650b390b48c2c62ec04eb6c4e5afa5d6f4b0db395429958fa19cde51f2" exitCode=0
Feb 27 00:28:22 crc kubenswrapper[4781]: I0227 00:28:22.663086 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cb47b6b2-760a-4899-84f6-fdf1bd62a418","Type":"ContainerDied","Data":"fd234a650b390b48c2c62ec04eb6c4e5afa5d6f4b0db395429958fa19cde51f2"}
Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.018284 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.102108 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5445c56cbd-fmcjz"
Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.130010 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a732412d-8655-4df0-90ba-1bf854b6d8d1-log-httpd\") pod \"a732412d-8655-4df0-90ba-1bf854b6d8d1\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") "
Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.130072 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-combined-ca-bundle\") pod \"a732412d-8655-4df0-90ba-1bf854b6d8d1\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") "
Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.130104 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-config-data\") pod \"a732412d-8655-4df0-90ba-1bf854b6d8d1\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") "
Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.130139 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a732412d-8655-4df0-90ba-1bf854b6d8d1-run-httpd\") pod \"a732412d-8655-4df0-90ba-1bf854b6d8d1\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") "
Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.130204 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-sg-core-conf-yaml\") pod \"a732412d-8655-4df0-90ba-1bf854b6d8d1\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") "
Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.130297 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lrpr\" (UniqueName:
\"kubernetes.io/projected/a732412d-8655-4df0-90ba-1bf854b6d8d1-kube-api-access-2lrpr\") pod \"a732412d-8655-4df0-90ba-1bf854b6d8d1\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.130387 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-scripts\") pod \"a732412d-8655-4df0-90ba-1bf854b6d8d1\" (UID: \"a732412d-8655-4df0-90ba-1bf854b6d8d1\") " Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.131931 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a732412d-8655-4df0-90ba-1bf854b6d8d1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a732412d-8655-4df0-90ba-1bf854b6d8d1" (UID: "a732412d-8655-4df0-90ba-1bf854b6d8d1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.135751 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a732412d-8655-4df0-90ba-1bf854b6d8d1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a732412d-8655-4df0-90ba-1bf854b6d8d1" (UID: "a732412d-8655-4df0-90ba-1bf854b6d8d1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.141791 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-scripts" (OuterVolumeSpecName: "scripts") pod "a732412d-8655-4df0-90ba-1bf854b6d8d1" (UID: "a732412d-8655-4df0-90ba-1bf854b6d8d1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.144537 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a732412d-8655-4df0-90ba-1bf854b6d8d1-kube-api-access-2lrpr" (OuterVolumeSpecName: "kube-api-access-2lrpr") pod "a732412d-8655-4df0-90ba-1bf854b6d8d1" (UID: "a732412d-8655-4df0-90ba-1bf854b6d8d1"). InnerVolumeSpecName "kube-api-access-2lrpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.173776 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a732412d-8655-4df0-90ba-1bf854b6d8d1" (UID: "a732412d-8655-4df0-90ba-1bf854b6d8d1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.225757 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a732412d-8655-4df0-90ba-1bf854b6d8d1" (UID: "a732412d-8655-4df0-90ba-1bf854b6d8d1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.231691 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-combined-ca-bundle\") pod \"34294cdd-a18f-4453-8d43-c4d1290e3c59\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.231770 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-ovndb-tls-certs\") pod \"34294cdd-a18f-4453-8d43-c4d1290e3c59\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.231878 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-httpd-config\") pod \"34294cdd-a18f-4453-8d43-c4d1290e3c59\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.231912 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b22wp\" (UniqueName: \"kubernetes.io/projected/34294cdd-a18f-4453-8d43-c4d1290e3c59-kube-api-access-b22wp\") pod \"34294cdd-a18f-4453-8d43-c4d1290e3c59\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.231972 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-config\") pod \"34294cdd-a18f-4453-8d43-c4d1290e3c59\" (UID: \"34294cdd-a18f-4453-8d43-c4d1290e3c59\") " Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.232455 4781 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/a732412d-8655-4df0-90ba-1bf854b6d8d1-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.232466 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.232476 4781 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a732412d-8655-4df0-90ba-1bf854b6d8d1-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.232484 4781 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.232493 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lrpr\" (UniqueName: \"kubernetes.io/projected/a732412d-8655-4df0-90ba-1bf854b6d8d1-kube-api-access-2lrpr\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.232500 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.237153 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "34294cdd-a18f-4453-8d43-c4d1290e3c59" (UID: "34294cdd-a18f-4453-8d43-c4d1290e3c59"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.239799 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34294cdd-a18f-4453-8d43-c4d1290e3c59-kube-api-access-b22wp" (OuterVolumeSpecName: "kube-api-access-b22wp") pod "34294cdd-a18f-4453-8d43-c4d1290e3c59" (UID: "34294cdd-a18f-4453-8d43-c4d1290e3c59"). InnerVolumeSpecName "kube-api-access-b22wp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:28:23 crc kubenswrapper[4781]: W0227 00:28:23.275722 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8ba5117_540f_448d_aac6_6fde482f5f14.slice/crio-724c37c7eaa0223157095c6adcc27778c7a840b1a037151c828e000ed66010e1 WatchSource:0}: Error finding container 724c37c7eaa0223157095c6adcc27778c7a840b1a037151c828e000ed66010e1: Status 404 returned error can't find the container with id 724c37c7eaa0223157095c6adcc27778c7a840b1a037151c828e000ed66010e1 Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.280902 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-config-data" (OuterVolumeSpecName: "config-data") pod "a732412d-8655-4df0-90ba-1bf854b6d8d1" (UID: "a732412d-8655-4df0-90ba-1bf854b6d8d1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.283188 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5c945d84cf-z5v9s"] Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.333719 4781 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.333745 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b22wp\" (UniqueName: \"kubernetes.io/projected/34294cdd-a18f-4453-8d43-c4d1290e3c59-kube-api-access-b22wp\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.333756 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a732412d-8655-4df0-90ba-1bf854b6d8d1-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.339814 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-config" (OuterVolumeSpecName: "config") pod "34294cdd-a18f-4453-8d43-c4d1290e3c59" (UID: "34294cdd-a18f-4453-8d43-c4d1290e3c59"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.411687 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "34294cdd-a18f-4453-8d43-c4d1290e3c59" (UID: "34294cdd-a18f-4453-8d43-c4d1290e3c59"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.427786 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34294cdd-a18f-4453-8d43-c4d1290e3c59" (UID: "34294cdd-a18f-4453-8d43-c4d1290e3c59"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.435818 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.435842 4781 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.435851 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/34294cdd-a18f-4453-8d43-c4d1290e3c59-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.573509 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.640770 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\") pod \"6ef40468-5e47-4e34-a641-bfbe7803d480\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.641124 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-internal-tls-certs\") pod \"6ef40468-5e47-4e34-a641-bfbe7803d480\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.641157 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-scripts\") pod \"6ef40468-5e47-4e34-a641-bfbe7803d480\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.641240 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-config-data\") pod \"6ef40468-5e47-4e34-a641-bfbe7803d480\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.641280 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ef40468-5e47-4e34-a641-bfbe7803d480-logs\") pod \"6ef40468-5e47-4e34-a641-bfbe7803d480\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.641318 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-combined-ca-bundle\") pod \"6ef40468-5e47-4e34-a641-bfbe7803d480\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.641449 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ef40468-5e47-4e34-a641-bfbe7803d480-httpd-run\") pod \"6ef40468-5e47-4e34-a641-bfbe7803d480\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.641504 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kc459\" (UniqueName: \"kubernetes.io/projected/6ef40468-5e47-4e34-a641-bfbe7803d480-kube-api-access-kc459\") pod \"6ef40468-5e47-4e34-a641-bfbe7803d480\" (UID: \"6ef40468-5e47-4e34-a641-bfbe7803d480\") " Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.642954 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ef40468-5e47-4e34-a641-bfbe7803d480-logs" (OuterVolumeSpecName: "logs") pod "6ef40468-5e47-4e34-a641-bfbe7803d480" (UID: "6ef40468-5e47-4e34-a641-bfbe7803d480"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.643580 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ef40468-5e47-4e34-a641-bfbe7803d480-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6ef40468-5e47-4e34-a641-bfbe7803d480" (UID: "6ef40468-5e47-4e34-a641-bfbe7803d480"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.650261 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-scripts" (OuterVolumeSpecName: "scripts") pod "6ef40468-5e47-4e34-a641-bfbe7803d480" (UID: "6ef40468-5e47-4e34-a641-bfbe7803d480"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.711940 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a732412d-8655-4df0-90ba-1bf854b6d8d1","Type":"ContainerDied","Data":"10c0f1e24689995e21992a81df3156a3ac869c2c63cffc5db5d95aae3523ee7b"} Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.712032 4781 scope.go:117] "RemoveContainer" containerID="e1347c9105935db12132917f879bb29404eb3328ae4b62fde3c6673f55672741" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.712428 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.725124 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5c945d84cf-z5v9s" event={"ID":"e8ba5117-540f-448d-aac6-6fde482f5f14","Type":"ContainerStarted","Data":"724c37c7eaa0223157095c6adcc27778c7a840b1a037151c828e000ed66010e1"} Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.728878 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ef40468-5e47-4e34-a641-bfbe7803d480-kube-api-access-kc459" (OuterVolumeSpecName: "kube-api-access-kc459") pod "6ef40468-5e47-4e34-a641-bfbe7803d480" (UID: "6ef40468-5e47-4e34-a641-bfbe7803d480"). InnerVolumeSpecName "kube-api-access-kc459". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.744201 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.744249 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ef40468-5e47-4e34-a641-bfbe7803d480-logs\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.744259 4781 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ef40468-5e47-4e34-a641-bfbe7803d480-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.744268 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kc459\" (UniqueName: \"kubernetes.io/projected/6ef40468-5e47-4e34-a641-bfbe7803d480-kube-api-access-kc459\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.767878 4781 generic.go:334] "Generic (PLEG): container finished" podID="6ef40468-5e47-4e34-a641-bfbe7803d480" containerID="16a69c06f9cb944df009af4983f6213dea2db781c728ba1df35b1c181223d614" exitCode=0 Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.767939 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6ef40468-5e47-4e34-a641-bfbe7803d480","Type":"ContainerDied","Data":"16a69c06f9cb944df009af4983f6213dea2db781c728ba1df35b1c181223d614"} Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.767964 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6ef40468-5e47-4e34-a641-bfbe7803d480","Type":"ContainerDied","Data":"7d0ca3340d609e18433fc291df1d484624d9e133542d96a4dff1a09c6cf6905a"} Feb 27 00:28:23 crc 
kubenswrapper[4781]: I0227 00:28:23.768021 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.804189 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5445c56cbd-fmcjz" event={"ID":"34294cdd-a18f-4453-8d43-c4d1290e3c59","Type":"ContainerDied","Data":"35abaddf64ded29044d57543bd49dba6fb7cc622e405ec56e6449b1f79234b7a"} Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.804294 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5445c56cbd-fmcjz" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.823878 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"02c4875e-e180-4365-a00a-828ab5d95c34","Type":"ContainerStarted","Data":"e39e425f056b37a8d5613652dc4f63f4d1fcd7f56dac5cb4ab465c2f4cfc31b4"} Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.876333 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.704420909 podStartE2EDuration="14.876314524s" podCreationTimestamp="2026-02-27 00:28:09 +0000 UTC" firstStartedPulling="2026-02-27 00:28:10.408724793 +0000 UTC m=+1359.666264337" lastFinishedPulling="2026-02-27 00:28:22.580618398 +0000 UTC m=+1371.838157952" observedRunningTime="2026-02-27 00:28:23.861777071 +0000 UTC m=+1373.119316625" watchObservedRunningTime="2026-02-27 00:28:23.876314524 +0000 UTC m=+1373.133854068" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.933790 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549" (OuterVolumeSpecName: "glance") pod "6ef40468-5e47-4e34-a641-bfbe7803d480" (UID: "6ef40468-5e47-4e34-a641-bfbe7803d480"). 
InnerVolumeSpecName "pvc-5bfae319-10bf-453e-8fc6-7da85b46e549". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 27 00:28:23 crc kubenswrapper[4781]: I0227 00:28:23.948124 4781 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\") on node \"crc\" " Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.000601 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ef40468-5e47-4e34-a641-bfbe7803d480" (UID: "6ef40468-5e47-4e34-a641-bfbe7803d480"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.005159 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-config-data" (OuterVolumeSpecName: "config-data") pod "6ef40468-5e47-4e34-a641-bfbe7803d480" (UID: "6ef40468-5e47-4e34-a641-bfbe7803d480"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.005535 4781 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.005713 4781 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-5bfae319-10bf-453e-8fc6-7da85b46e549" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549") on node "crc" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.039009 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6ef40468-5e47-4e34-a641-bfbe7803d480" (UID: "6ef40468-5e47-4e34-a641-bfbe7803d480"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.050337 4781 reconciler_common.go:293] "Volume detached for volume \"pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.050647 4781 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.050659 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.050668 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef40468-5e47-4e34-a641-bfbe7803d480-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.102663 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.153556 4781 scope.go:117] "RemoveContainer" containerID="41f966adea97cdc475ad08a86a255ece7a9da3613c19d0f63f5a59a5a293320f" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.154293 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-config-data\") pod \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.155050 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.155128 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb47b6b2-760a-4899-84f6-fdf1bd62a418-logs\") pod \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.155163 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb47b6b2-760a-4899-84f6-fdf1bd62a418-httpd-run\") pod \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.155204 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-combined-ca-bundle\") pod \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " Feb 
27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.155389 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-public-tls-certs\") pod \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.155644 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb47b6b2-760a-4899-84f6-fdf1bd62a418-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cb47b6b2-760a-4899-84f6-fdf1bd62a418" (UID: "cb47b6b2-760a-4899-84f6-fdf1bd62a418"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.155667 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-scripts\") pod \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.155687 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pftsh\" (UniqueName: \"kubernetes.io/projected/cb47b6b2-760a-4899-84f6-fdf1bd62a418-kube-api-access-pftsh\") pod \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.155712 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb47b6b2-760a-4899-84f6-fdf1bd62a418-logs" (OuterVolumeSpecName: "logs") pod "cb47b6b2-760a-4899-84f6-fdf1bd62a418" (UID: "cb47b6b2-760a-4899-84f6-fdf1bd62a418"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.156399 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb47b6b2-760a-4899-84f6-fdf1bd62a418-logs\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.156416 4781 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb47b6b2-760a-4899-84f6-fdf1bd62a418-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.158499 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-scripts" (OuterVolumeSpecName: "scripts") pod "cb47b6b2-760a-4899-84f6-fdf1bd62a418" (UID: "cb47b6b2-760a-4899-84f6-fdf1bd62a418"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.163731 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb47b6b2-760a-4899-84f6-fdf1bd62a418-kube-api-access-pftsh" (OuterVolumeSpecName: "kube-api-access-pftsh") pod "cb47b6b2-760a-4899-84f6-fdf1bd62a418" (UID: "cb47b6b2-760a-4899-84f6-fdf1bd62a418"). InnerVolumeSpecName "kube-api-access-pftsh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.178964 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5445c56cbd-fmcjz"] Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.215265 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5445c56cbd-fmcjz"] Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.227783 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb47b6b2-760a-4899-84f6-fdf1bd62a418" (UID: "cb47b6b2-760a-4899-84f6-fdf1bd62a418"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.237954 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.267801 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.279589 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.301736 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:28:24 crc kubenswrapper[4781]: E0227 00:28:24.302817 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerName="ceilometer-central-agent" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.302843 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerName="ceilometer-central-agent" Feb 27 00:28:24 crc kubenswrapper[4781]: E0227 
00:28:24.302861 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerName="proxy-httpd" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.302869 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerName="proxy-httpd" Feb 27 00:28:24 crc kubenswrapper[4781]: E0227 00:28:24.302881 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerName="ceilometer-notification-agent" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.302887 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerName="ceilometer-notification-agent" Feb 27 00:28:24 crc kubenswrapper[4781]: E0227 00:28:24.302895 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb47b6b2-760a-4899-84f6-fdf1bd62a418" containerName="glance-log" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.302905 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb47b6b2-760a-4899-84f6-fdf1bd62a418" containerName="glance-log" Feb 27 00:28:24 crc kubenswrapper[4781]: E0227 00:28:24.302931 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34294cdd-a18f-4453-8d43-c4d1290e3c59" containerName="neutron-httpd" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.302939 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="34294cdd-a18f-4453-8d43-c4d1290e3c59" containerName="neutron-httpd" Feb 27 00:28:24 crc kubenswrapper[4781]: E0227 00:28:24.302961 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerName="sg-core" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.302967 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerName="sg-core" Feb 27 00:28:24 crc kubenswrapper[4781]: E0227 00:28:24.302991 4781 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34294cdd-a18f-4453-8d43-c4d1290e3c59" containerName="neutron-api" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.302998 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="34294cdd-a18f-4453-8d43-c4d1290e3c59" containerName="neutron-api" Feb 27 00:28:24 crc kubenswrapper[4781]: E0227 00:28:24.303022 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb47b6b2-760a-4899-84f6-fdf1bd62a418" containerName="glance-httpd" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.303027 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb47b6b2-760a-4899-84f6-fdf1bd62a418" containerName="glance-httpd" Feb 27 00:28:24 crc kubenswrapper[4781]: E0227 00:28:24.303046 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef40468-5e47-4e34-a641-bfbe7803d480" containerName="glance-log" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.303053 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef40468-5e47-4e34-a641-bfbe7803d480" containerName="glance-log" Feb 27 00:28:24 crc kubenswrapper[4781]: E0227 00:28:24.303064 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef40468-5e47-4e34-a641-bfbe7803d480" containerName="glance-httpd" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.303070 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef40468-5e47-4e34-a641-bfbe7803d480" containerName="glance-httpd" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.303607 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerName="sg-core" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.303648 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerName="ceilometer-central-agent" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.303668 4781 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="cb47b6b2-760a-4899-84f6-fdf1bd62a418" containerName="glance-log" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.303682 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerName="proxy-httpd" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.303703 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ef40468-5e47-4e34-a641-bfbe7803d480" containerName="glance-httpd" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.303730 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="34294cdd-a18f-4453-8d43-c4d1290e3c59" containerName="neutron-httpd" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.303744 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb47b6b2-760a-4899-84f6-fdf1bd62a418" containerName="glance-httpd" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.303760 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a732412d-8655-4df0-90ba-1bf854b6d8d1" containerName="ceilometer-notification-agent" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.303784 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="34294cdd-a18f-4453-8d43-c4d1290e3c59" containerName="neutron-api" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.303796 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ef40468-5e47-4e34-a641-bfbe7803d480" containerName="glance-log" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.307087 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.309716 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:24 crc kubenswrapper[4781]: E0227 00:28:24.310564 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b podName:cb47b6b2-760a-4899-84f6-fdf1bd62a418 nodeName:}" failed. No retries permitted until 2026-02-27 00:28:24.810539755 +0000 UTC m=+1374.068079319 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "glance" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b") pod "cb47b6b2-760a-4899-84f6-fdf1bd62a418" (UID: "cb47b6b2-760a-4899-84f6-fdf1bd62a418") : kubernetes.io/csi: Unmounter.TearDownAt failed: rpc error: code = Unknown desc = check target path: could not get consistent content of /proc/mounts after 3 attempts Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.310994 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pftsh\" (UniqueName: \"kubernetes.io/projected/cb47b6b2-760a-4899-84f6-fdf1bd62a418-kube-api-access-pftsh\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.319753 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.319893 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.346990 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod 
"cb47b6b2-760a-4899-84f6-fdf1bd62a418" (UID: "cb47b6b2-760a-4899-84f6-fdf1bd62a418"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.361581 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.383209 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.386591 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.387536 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-config-data" (OuterVolumeSpecName: "config-data") pod "cb47b6b2-760a-4899-84f6-fdf1bd62a418" (UID: "cb47b6b2-760a-4899-84f6-fdf1bd62a418"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.394501 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.396772 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.403488 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.403694 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.403934 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.412771 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.413006 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-run-httpd\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.413111 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-log-httpd\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.413324 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-scripts\") pod \"ceilometer-0\" (UID: 
\"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.413402 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.413476 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkfb8\" (UniqueName: \"kubernetes.io/projected/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-kube-api-access-qkfb8\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.413599 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-config-data\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.414602 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.414696 4781 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb47b6b2-760a-4899-84f6-fdf1bd62a418-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.479428 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:28:24 crc kubenswrapper[4781]: E0227 00:28:24.481746 4781 pod_workers.go:1301] "Error syncing pod, 
skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-qkfb8 log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="aeac2eb0-b7d3-43c0-91f3-14cc90abcab5" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.516911 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-scripts\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.516948 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/141465f3-d299-4d9c-a74f-0df5c741e325-scripts\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.516972 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.516999 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.517019 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkfb8\" 
(UniqueName: \"kubernetes.io/projected/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-kube-api-access-qkfb8\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.517075 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-config-data\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.517110 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/141465f3-d299-4d9c-a74f-0df5c741e325-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.517135 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/141465f3-d299-4d9c-a74f-0df5c741e325-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.517174 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/141465f3-d299-4d9c-a74f-0df5c741e325-config-data\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.517209 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/141465f3-d299-4d9c-a74f-0df5c741e325-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.517245 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.517265 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-run-httpd\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.517347 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/141465f3-d299-4d9c-a74f-0df5c741e325-logs\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.517434 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcz5t\" (UniqueName: \"kubernetes.io/projected/141465f3-d299-4d9c-a74f-0df5c741e325-kube-api-access-fcz5t\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.517505 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-log-httpd\") pod 
\"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.517595 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-run-httpd\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.518026 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-log-httpd\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.521240 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.521675 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.522142 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-config-data\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.523894 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-scripts\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.547348 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkfb8\" (UniqueName: \"kubernetes.io/projected/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-kube-api-access-qkfb8\") pod \"ceilometer-0\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.619560 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/141465f3-d299-4d9c-a74f-0df5c741e325-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.619670 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/141465f3-d299-4d9c-a74f-0df5c741e325-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.619696 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/141465f3-d299-4d9c-a74f-0df5c741e325-config-data\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.619758 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/141465f3-d299-4d9c-a74f-0df5c741e325-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.619797 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcz5t\" (UniqueName: \"kubernetes.io/projected/141465f3-d299-4d9c-a74f-0df5c741e325-kube-api-access-fcz5t\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.619903 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/141465f3-d299-4d9c-a74f-0df5c741e325-scripts\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.619968 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.620022 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/141465f3-d299-4d9c-a74f-0df5c741e325-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.620481 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/141465f3-d299-4d9c-a74f-0df5c741e325-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.621991 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/141465f3-d299-4d9c-a74f-0df5c741e325-logs\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.626167 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/141465f3-d299-4d9c-a74f-0df5c741e325-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.626221 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/141465f3-d299-4d9c-a74f-0df5c741e325-scripts\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.626882 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/141465f3-d299-4d9c-a74f-0df5c741e325-config-data\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.627194 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/141465f3-d299-4d9c-a74f-0df5c741e325-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc 
kubenswrapper[4781]: I0227 00:28:24.627364 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.627389 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a7b96405e17327882846f95b5adf8b290f3f24e0a3e5cf6d272cf20133e6cae4/globalmount\"" pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.642875 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcz5t\" (UniqueName: \"kubernetes.io/projected/141465f3-d299-4d9c-a74f-0df5c741e325-kube-api-access-fcz5t\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.678921 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5bfae319-10bf-453e-8fc6-7da85b46e549\") pod \"glance-default-internal-api-0\" (UID: \"141465f3-d299-4d9c-a74f-0df5c741e325\") " pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.776781 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.824268 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\" (UID: \"cb47b6b2-760a-4899-84f6-fdf1bd62a418\") " Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.836492 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5c945d84cf-z5v9s" event={"ID":"e8ba5117-540f-448d-aac6-6fde482f5f14","Type":"ContainerStarted","Data":"1017d28e75e458419a53d6ae77f42ce11681962ac06d8080c95f2799a44c6f64"} Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.839557 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cb47b6b2-760a-4899-84f6-fdf1bd62a418","Type":"ContainerDied","Data":"866ec8dc8dd6eea2cbe5498cddcd7820b1b7a00e1ecb5ebf3196c3d57588106d"} Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.839653 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.843285 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.854263 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.894329 4781 scope.go:117] "RemoveContainer" containerID="576df563fec491fe4b88b02b86a929d4019c459ebde0d69bbe30c74025de222c" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.926680 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-sg-core-conf-yaml\") pod \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.926747 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-config-data\") pod \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.926843 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-log-httpd\") pod \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.926915 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkfb8\" (UniqueName: \"kubernetes.io/projected/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-kube-api-access-qkfb8\") pod \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.926972 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-run-httpd\") pod \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " Feb 27 
00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.927078 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-scripts\") pod \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.927103 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-combined-ca-bundle\") pod \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\" (UID: \"aeac2eb0-b7d3-43c0-91f3-14cc90abcab5\") " Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.927440 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "aeac2eb0-b7d3-43c0-91f3-14cc90abcab5" (UID: "aeac2eb0-b7d3-43c0-91f3-14cc90abcab5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.927655 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "aeac2eb0-b7d3-43c0-91f3-14cc90abcab5" (UID: "aeac2eb0-b7d3-43c0-91f3-14cc90abcab5"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.928186 4781 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:24 crc kubenswrapper[4781]: I0227 00:28:24.928206 4781 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.322399 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34294cdd-a18f-4453-8d43-c4d1290e3c59" path="/var/lib/kubelet/pods/34294cdd-a18f-4453-8d43-c4d1290e3c59/volumes" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.323422 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ef40468-5e47-4e34-a641-bfbe7803d480" path="/var/lib/kubelet/pods/6ef40468-5e47-4e34-a641-bfbe7803d480/volumes" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.324117 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a732412d-8655-4df0-90ba-1bf854b6d8d1" path="/var/lib/kubelet/pods/a732412d-8655-4df0-90ba-1bf854b6d8d1/volumes" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.332014 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-kube-api-access-qkfb8" (OuterVolumeSpecName: "kube-api-access-qkfb8") pod "aeac2eb0-b7d3-43c0-91f3-14cc90abcab5" (UID: "aeac2eb0-b7d3-43c0-91f3-14cc90abcab5"). InnerVolumeSpecName "kube-api-access-qkfb8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.332602 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-scripts" (OuterVolumeSpecName: "scripts") pod "aeac2eb0-b7d3-43c0-91f3-14cc90abcab5" (UID: "aeac2eb0-b7d3-43c0-91f3-14cc90abcab5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.332617 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aeac2eb0-b7d3-43c0-91f3-14cc90abcab5" (UID: "aeac2eb0-b7d3-43c0-91f3-14cc90abcab5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.334703 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "aeac2eb0-b7d3-43c0-91f3-14cc90abcab5" (UID: "aeac2eb0-b7d3-43c0-91f3-14cc90abcab5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.335214 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-config-data" (OuterVolumeSpecName: "config-data") pod "aeac2eb0-b7d3-43c0-91f3-14cc90abcab5" (UID: "aeac2eb0-b7d3-43c0-91f3-14cc90abcab5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.339170 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.339217 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.339236 4781 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.339257 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.339275 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkfb8\" (UniqueName: \"kubernetes.io/projected/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5-kube-api-access-qkfb8\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.351052 4781 scope.go:117] "RemoveContainer" containerID="57f8cc16c7b772f63445194c5db7782e3fcd2bdad4a28c47f2161d0b1572b6c9" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.366717 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b" (OuterVolumeSpecName: "glance") pod "cb47b6b2-760a-4899-84f6-fdf1bd62a418" (UID: "cb47b6b2-760a-4899-84f6-fdf1bd62a418"). InnerVolumeSpecName "pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.413875 4781 scope.go:117] "RemoveContainer" containerID="14d3fcba4ac0c08489e958e9281bb38dcd169375967f24d085bf95a4995989d3" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.441966 4781 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") on node \"crc\" " Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.507743 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.512967 4781 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.513099 4781 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b") on node "crc" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.526683 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.537842 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.539873 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.542031 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.542241 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.543553 4781 reconciler_common.go:293] "Volume detached for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.548588 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.606113 4781 scope.go:117] "RemoveContainer" containerID="16a69c06f9cb944df009af4983f6213dea2db781c728ba1df35b1c181223d614" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.645394 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-logs\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.645454 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.645524 4781 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-config-data\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.645574 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-scripts\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.645608 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.645666 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.645681 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8dlt\" (UniqueName: \"kubernetes.io/projected/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-kube-api-access-s8dlt\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc 
kubenswrapper[4781]: I0227 00:28:25.645699 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.653377 4781 scope.go:117] "RemoveContainer" containerID="1cfa42ca96106627e8f6683c51b74701010592d0677332d334a174eb6459416d" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.719833 4781 scope.go:117] "RemoveContainer" containerID="16a69c06f9cb944df009af4983f6213dea2db781c728ba1df35b1c181223d614" Feb 27 00:28:25 crc kubenswrapper[4781]: E0227 00:28:25.720325 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16a69c06f9cb944df009af4983f6213dea2db781c728ba1df35b1c181223d614\": container with ID starting with 16a69c06f9cb944df009af4983f6213dea2db781c728ba1df35b1c181223d614 not found: ID does not exist" containerID="16a69c06f9cb944df009af4983f6213dea2db781c728ba1df35b1c181223d614" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.720366 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16a69c06f9cb944df009af4983f6213dea2db781c728ba1df35b1c181223d614"} err="failed to get container status \"16a69c06f9cb944df009af4983f6213dea2db781c728ba1df35b1c181223d614\": rpc error: code = NotFound desc = could not find container \"16a69c06f9cb944df009af4983f6213dea2db781c728ba1df35b1c181223d614\": container with ID starting with 16a69c06f9cb944df009af4983f6213dea2db781c728ba1df35b1c181223d614 not found: ID does not exist" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.720396 4781 scope.go:117] "RemoveContainer" containerID="1cfa42ca96106627e8f6683c51b74701010592d0677332d334a174eb6459416d" Feb 27 
00:28:25 crc kubenswrapper[4781]: E0227 00:28:25.720699 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cfa42ca96106627e8f6683c51b74701010592d0677332d334a174eb6459416d\": container with ID starting with 1cfa42ca96106627e8f6683c51b74701010592d0677332d334a174eb6459416d not found: ID does not exist" containerID="1cfa42ca96106627e8f6683c51b74701010592d0677332d334a174eb6459416d" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.720734 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cfa42ca96106627e8f6683c51b74701010592d0677332d334a174eb6459416d"} err="failed to get container status \"1cfa42ca96106627e8f6683c51b74701010592d0677332d334a174eb6459416d\": rpc error: code = NotFound desc = could not find container \"1cfa42ca96106627e8f6683c51b74701010592d0677332d334a174eb6459416d\": container with ID starting with 1cfa42ca96106627e8f6683c51b74701010592d0677332d334a174eb6459416d not found: ID does not exist" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.720763 4781 scope.go:117] "RemoveContainer" containerID="24cfe8100e51fc567495df6c5f9d60a27bc0381d4315fb54a1ac7e37d2a6bf89" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.747983 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-logs\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.748034 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " 
pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.748101 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-config-data\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.748151 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-scripts\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.748200 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.748980 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-logs\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.749069 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8dlt\" (UniqueName: \"kubernetes.io/projected/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-kube-api-access-s8dlt\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc 
kubenswrapper[4781]: I0227 00:28:25.749096 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.749124 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.749855 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.756248 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-scripts\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.756454 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.765851 4781 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-config-data\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.766357 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.766386 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5d3045414bd1cd74ec61e0394ba262493610c57a87bbc940ef275e8fc1bc2ecf/globalmount\"" pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.772183 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8dlt\" (UniqueName: \"kubernetes.io/projected/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-kube-api-access-s8dlt\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.783141 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/409aba7a-466d-40a0-b9bd-7dfd8d81ee4f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.796441 4781 scope.go:117] "RemoveContainer" 
containerID="0384541fca62a0c17aeca1e73d81a12f432aa7f744f83cfe8433a7d935539961" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.831942 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a315a62d-bbaf-412d-9828-c6f8bcbfab6b\") pod \"glance-default-external-api-0\" (UID: \"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f\") " pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.863279 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5c945d84cf-z5v9s" event={"ID":"e8ba5117-540f-448d-aac6-6fde482f5f14","Type":"ContainerStarted","Data":"c1b2f287d85fff96968a0f89edc99d93ebe479cc29c7781601d414abffec3f4f"} Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.864410 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5c945d84cf-z5v9s" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.864451 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5c945d84cf-z5v9s" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.871275 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.901613 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.921055 4781 scope.go:117] "RemoveContainer" containerID="fd234a650b390b48c2c62ec04eb6c4e5afa5d6f4b0db395429958fa19cde51f2" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.951032 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5c945d84cf-z5v9s" podStartSLOduration=9.950664744000001 podStartE2EDuration="9.950664744s" podCreationTimestamp="2026-02-27 00:28:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:28:25.900388436 +0000 UTC m=+1375.157928000" watchObservedRunningTime="2026-02-27 00:28:25.950664744 +0000 UTC m=+1375.208204298" Feb 27 00:28:25 crc kubenswrapper[4781]: I0227 00:28:25.989200 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.000303 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.007652 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.014585 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.016365 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.023956 4781 scope.go:117] "RemoveContainer" containerID="5483ad8c7ab58752a9371dfb8baad38f002e9c4bc521ec62a86d28db8755aca8" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.024537 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.025766 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.056116 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/546554c7-b0b0-4363-b1f8-6f83d43562cc-log-httpd\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.056200 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4sp2\" (UniqueName: \"kubernetes.io/projected/546554c7-b0b0-4363-b1f8-6f83d43562cc-kube-api-access-s4sp2\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.056306 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.056393 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-config-data\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.056425 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.056452 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/546554c7-b0b0-4363-b1f8-6f83d43562cc-run-httpd\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.056478 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-scripts\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.084190 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.157869 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-config-data\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.157910 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.157933 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/546554c7-b0b0-4363-b1f8-6f83d43562cc-run-httpd\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.157952 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-scripts\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.157981 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/546554c7-b0b0-4363-b1f8-6f83d43562cc-log-httpd\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.158026 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4sp2\" (UniqueName: \"kubernetes.io/projected/546554c7-b0b0-4363-b1f8-6f83d43562cc-kube-api-access-s4sp2\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.158120 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc 
kubenswrapper[4781]: W0227 00:28:26.158787 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod141465f3_d299_4d9c_a74f_0df5c741e325.slice/crio-92cc791906e49e4f9f91cf90b227b97c529aa4271f277080ed3b0e4a9ce26cba WatchSource:0}: Error finding container 92cc791906e49e4f9f91cf90b227b97c529aa4271f277080ed3b0e4a9ce26cba: Status 404 returned error can't find the container with id 92cc791906e49e4f9f91cf90b227b97c529aa4271f277080ed3b0e4a9ce26cba Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.161789 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/546554c7-b0b0-4363-b1f8-6f83d43562cc-log-httpd\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.162955 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-scripts\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.164387 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.164823 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.165072 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-config-data\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.167957 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/546554c7-b0b0-4363-b1f8-6f83d43562cc-run-httpd\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.177296 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4sp2\" (UniqueName: \"kubernetes.io/projected/546554c7-b0b0-4363-b1f8-6f83d43562cc-kube-api-access-s4sp2\") pod \"ceilometer-0\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") " pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.422767 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.631340 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.893663 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f","Type":"ContainerStarted","Data":"4aa356d4f49dee1250771eb9ae06ae0ace500dad12850e3423a5c80de402db7c"} Feb 27 00:28:26 crc kubenswrapper[4781]: I0227 00:28:26.898105 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"141465f3-d299-4d9c-a74f-0df5c741e325","Type":"ContainerStarted","Data":"92cc791906e49e4f9f91cf90b227b97c529aa4271f277080ed3b0e4a9ce26cba"} Feb 27 00:28:27 crc kubenswrapper[4781]: I0227 00:28:27.012204 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:28:27 crc kubenswrapper[4781]: I0227 00:28:27.339666 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aeac2eb0-b7d3-43c0-91f3-14cc90abcab5" path="/var/lib/kubelet/pods/aeac2eb0-b7d3-43c0-91f3-14cc90abcab5/volumes" Feb 27 00:28:27 crc kubenswrapper[4781]: I0227 00:28:27.340548 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb47b6b2-760a-4899-84f6-fdf1bd62a418" path="/var/lib/kubelet/pods/cb47b6b2-760a-4899-84f6-fdf1bd62a418/volumes" Feb 27 00:28:27 crc kubenswrapper[4781]: I0227 00:28:27.936951 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f","Type":"ContainerStarted","Data":"18dc837276b2a2b0958acc6d515dddb9fbb4d20a0b4c318bea5123475d01df79"} Feb 27 00:28:27 crc kubenswrapper[4781]: I0227 00:28:27.943444 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"141465f3-d299-4d9c-a74f-0df5c741e325","Type":"ContainerStarted","Data":"da6a90edd57a33fc09015e81e4fdc592ea63eee487b65d1d6bc96eef651c7157"} Feb 27 00:28:27 crc kubenswrapper[4781]: I0227 00:28:27.943472 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"141465f3-d299-4d9c-a74f-0df5c741e325","Type":"ContainerStarted","Data":"a7629cfeae5afc1dd28b6ff30a5da85200d90e321ba1ec6789f35584daece76c"} Feb 27 00:28:27 crc kubenswrapper[4781]: I0227 00:28:27.950280 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"546554c7-b0b0-4363-b1f8-6f83d43562cc","Type":"ContainerStarted","Data":"a513fdece7b001c5868cd23c79b7671562961e01d9db2105289006ccfa1d5641"} Feb 27 00:28:27 crc kubenswrapper[4781]: I0227 00:28:27.976961 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.976943187 podStartE2EDuration="3.976943187s" podCreationTimestamp="2026-02-27 00:28:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:28:27.96593032 +0000 UTC m=+1377.223469874" watchObservedRunningTime="2026-02-27 00:28:27.976943187 +0000 UTC m=+1377.234482731" Feb 27 00:28:28 crc kubenswrapper[4781]: I0227 00:28:28.972330 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"409aba7a-466d-40a0-b9bd-7dfd8d81ee4f","Type":"ContainerStarted","Data":"81a2f54c4f1eb14da95bab33aa5d3a5c0ed38b81e46bf563e55946cf4995cc42"} Feb 27 00:28:28 crc kubenswrapper[4781]: I0227 00:28:28.981771 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"546554c7-b0b0-4363-b1f8-6f83d43562cc","Type":"ContainerStarted","Data":"3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486"} Feb 27 00:28:28 crc kubenswrapper[4781]: I0227 
00:28:28.981952 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"546554c7-b0b0-4363-b1f8-6f83d43562cc","Type":"ContainerStarted","Data":"f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8"} Feb 27 00:28:29 crc kubenswrapper[4781]: I0227 00:28:29.010109 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.010070039 podStartE2EDuration="4.010070039s" podCreationTimestamp="2026-02-27 00:28:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:28:29.008106866 +0000 UTC m=+1378.265646430" watchObservedRunningTime="2026-02-27 00:28:29.010070039 +0000 UTC m=+1378.267609593" Feb 27 00:28:29 crc kubenswrapper[4781]: I0227 00:28:29.829437 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-kcmlj"] Feb 27 00:28:29 crc kubenswrapper[4781]: I0227 00:28:29.831146 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-kcmlj" Feb 27 00:28:29 crc kubenswrapper[4781]: I0227 00:28:29.845596 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-kcmlj"] Feb 27 00:28:29 crc kubenswrapper[4781]: I0227 00:28:29.852126 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gw86\" (UniqueName: \"kubernetes.io/projected/7f0e335a-e4a1-48ee-b470-a6277acc5dae-kube-api-access-9gw86\") pod \"nova-api-db-create-kcmlj\" (UID: \"7f0e335a-e4a1-48ee-b470-a6277acc5dae\") " pod="openstack/nova-api-db-create-kcmlj" Feb 27 00:28:29 crc kubenswrapper[4781]: I0227 00:28:29.852261 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f0e335a-e4a1-48ee-b470-a6277acc5dae-operator-scripts\") pod \"nova-api-db-create-kcmlj\" (UID: \"7f0e335a-e4a1-48ee-b470-a6277acc5dae\") " pod="openstack/nova-api-db-create-kcmlj" Feb 27 00:28:29 crc kubenswrapper[4781]: I0227 00:28:29.936642 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-lgv6f"] Feb 27 00:28:29 crc kubenswrapper[4781]: I0227 00:28:29.938576 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-lgv6f" Feb 27 00:28:29 crc kubenswrapper[4781]: I0227 00:28:29.951991 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-9245-account-create-update-j6hsh"] Feb 27 00:28:29 crc kubenswrapper[4781]: I0227 00:28:29.953658 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-9245-account-create-update-j6hsh" Feb 27 00:28:29 crc kubenswrapper[4781]: I0227 00:28:29.953953 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f0e335a-e4a1-48ee-b470-a6277acc5dae-operator-scripts\") pod \"nova-api-db-create-kcmlj\" (UID: \"7f0e335a-e4a1-48ee-b470-a6277acc5dae\") " pod="openstack/nova-api-db-create-kcmlj" Feb 27 00:28:29 crc kubenswrapper[4781]: I0227 00:28:29.954101 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb5sg\" (UniqueName: \"kubernetes.io/projected/5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce-kube-api-access-nb5sg\") pod \"nova-cell0-db-create-lgv6f\" (UID: \"5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce\") " pod="openstack/nova-cell0-db-create-lgv6f" Feb 27 00:28:29 crc kubenswrapper[4781]: I0227 00:28:29.954153 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce-operator-scripts\") pod \"nova-cell0-db-create-lgv6f\" (UID: \"5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce\") " pod="openstack/nova-cell0-db-create-lgv6f" Feb 27 00:28:29 crc kubenswrapper[4781]: I0227 00:28:29.954233 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gw86\" (UniqueName: \"kubernetes.io/projected/7f0e335a-e4a1-48ee-b470-a6277acc5dae-kube-api-access-9gw86\") pod \"nova-api-db-create-kcmlj\" (UID: \"7f0e335a-e4a1-48ee-b470-a6277acc5dae\") " pod="openstack/nova-api-db-create-kcmlj" Feb 27 00:28:29 crc kubenswrapper[4781]: I0227 00:28:29.954837 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f0e335a-e4a1-48ee-b470-a6277acc5dae-operator-scripts\") pod \"nova-api-db-create-kcmlj\" (UID: 
\"7f0e335a-e4a1-48ee-b470-a6277acc5dae\") " pod="openstack/nova-api-db-create-kcmlj" Feb 27 00:28:29 crc kubenswrapper[4781]: I0227 00:28:29.956196 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 27 00:28:29 crc kubenswrapper[4781]: I0227 00:28:29.991503 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gw86\" (UniqueName: \"kubernetes.io/projected/7f0e335a-e4a1-48ee-b470-a6277acc5dae-kube-api-access-9gw86\") pod \"nova-api-db-create-kcmlj\" (UID: \"7f0e335a-e4a1-48ee-b470-a6277acc5dae\") " pod="openstack/nova-api-db-create-kcmlj" Feb 27 00:28:29 crc kubenswrapper[4781]: I0227 00:28:29.991574 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-lgv6f"] Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.005129 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"546554c7-b0b0-4363-b1f8-6f83d43562cc","Type":"ContainerStarted","Data":"e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd"} Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.013003 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-9245-account-create-update-j6hsh"] Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.055907 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb5sg\" (UniqueName: \"kubernetes.io/projected/5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce-kube-api-access-nb5sg\") pod \"nova-cell0-db-create-lgv6f\" (UID: \"5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce\") " pod="openstack/nova-cell0-db-create-lgv6f" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.055984 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce-operator-scripts\") pod \"nova-cell0-db-create-lgv6f\" (UID: 
\"5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce\") " pod="openstack/nova-cell0-db-create-lgv6f" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.056059 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6795d880-5f00-4be4-9c67-6f8a251550cb-operator-scripts\") pod \"nova-api-9245-account-create-update-j6hsh\" (UID: \"6795d880-5f00-4be4-9c67-6f8a251550cb\") " pod="openstack/nova-api-9245-account-create-update-j6hsh" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.056182 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt8x7\" (UniqueName: \"kubernetes.io/projected/6795d880-5f00-4be4-9c67-6f8a251550cb-kube-api-access-vt8x7\") pod \"nova-api-9245-account-create-update-j6hsh\" (UID: \"6795d880-5f00-4be4-9c67-6f8a251550cb\") " pod="openstack/nova-api-9245-account-create-update-j6hsh" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.059032 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce-operator-scripts\") pod \"nova-cell0-db-create-lgv6f\" (UID: \"5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce\") " pod="openstack/nova-cell0-db-create-lgv6f" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.075147 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-qx8nd"] Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.076618 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-qx8nd" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.086478 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-qx8nd"] Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.088189 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb5sg\" (UniqueName: \"kubernetes.io/projected/5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce-kube-api-access-nb5sg\") pod \"nova-cell0-db-create-lgv6f\" (UID: \"5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce\") " pod="openstack/nova-cell0-db-create-lgv6f" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.156785 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-kcmlj" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.158145 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt8x7\" (UniqueName: \"kubernetes.io/projected/6795d880-5f00-4be4-9c67-6f8a251550cb-kube-api-access-vt8x7\") pod \"nova-api-9245-account-create-update-j6hsh\" (UID: \"6795d880-5f00-4be4-9c67-6f8a251550cb\") " pod="openstack/nova-api-9245-account-create-update-j6hsh" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.158241 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6795d880-5f00-4be4-9c67-6f8a251550cb-operator-scripts\") pod \"nova-api-9245-account-create-update-j6hsh\" (UID: \"6795d880-5f00-4be4-9c67-6f8a251550cb\") " pod="openstack/nova-api-9245-account-create-update-j6hsh" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.158870 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6795d880-5f00-4be4-9c67-6f8a251550cb-operator-scripts\") pod \"nova-api-9245-account-create-update-j6hsh\" (UID: \"6795d880-5f00-4be4-9c67-6f8a251550cb\") 
" pod="openstack/nova-api-9245-account-create-update-j6hsh" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.158992 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-141e-account-create-update-msmcr"] Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.161570 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-141e-account-create-update-msmcr" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.167156 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.180161 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-141e-account-create-update-msmcr"] Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.181820 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt8x7\" (UniqueName: \"kubernetes.io/projected/6795d880-5f00-4be4-9c67-6f8a251550cb-kube-api-access-vt8x7\") pod \"nova-api-9245-account-create-update-j6hsh\" (UID: \"6795d880-5f00-4be4-9c67-6f8a251550cb\") " pod="openstack/nova-api-9245-account-create-update-j6hsh" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.259864 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7468389a-cc9b-404c-9414-4d81f3b1a7e5-operator-scripts\") pod \"nova-cell1-db-create-qx8nd\" (UID: \"7468389a-cc9b-404c-9414-4d81f3b1a7e5\") " pod="openstack/nova-cell1-db-create-qx8nd" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.259949 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc7ql\" (UniqueName: \"kubernetes.io/projected/7468389a-cc9b-404c-9414-4d81f3b1a7e5-kube-api-access-sc7ql\") pod \"nova-cell1-db-create-qx8nd\" (UID: \"7468389a-cc9b-404c-9414-4d81f3b1a7e5\") " 
pod="openstack/nova-cell1-db-create-qx8nd" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.266276 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-lgv6f" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.281427 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-9245-account-create-update-j6hsh" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.347848 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cd3e-account-create-update-dkxt7"] Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.349424 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cd3e-account-create-update-dkxt7" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.352152 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.359811 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cd3e-account-create-update-dkxt7"] Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.362514 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7468389a-cc9b-404c-9414-4d81f3b1a7e5-operator-scripts\") pod \"nova-cell1-db-create-qx8nd\" (UID: \"7468389a-cc9b-404c-9414-4d81f3b1a7e5\") " pod="openstack/nova-cell1-db-create-qx8nd" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.362599 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc7ql\" (UniqueName: \"kubernetes.io/projected/7468389a-cc9b-404c-9414-4d81f3b1a7e5-kube-api-access-sc7ql\") pod \"nova-cell1-db-create-qx8nd\" (UID: \"7468389a-cc9b-404c-9414-4d81f3b1a7e5\") " pod="openstack/nova-cell1-db-create-qx8nd" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.362862 4781 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2f8e017-da89-4ce0-a5b7-2339b2cf18a5-operator-scripts\") pod \"nova-cell0-141e-account-create-update-msmcr\" (UID: \"c2f8e017-da89-4ce0-a5b7-2339b2cf18a5\") " pod="openstack/nova-cell0-141e-account-create-update-msmcr" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.362907 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqc85\" (UniqueName: \"kubernetes.io/projected/c2f8e017-da89-4ce0-a5b7-2339b2cf18a5-kube-api-access-cqc85\") pod \"nova-cell0-141e-account-create-update-msmcr\" (UID: \"c2f8e017-da89-4ce0-a5b7-2339b2cf18a5\") " pod="openstack/nova-cell0-141e-account-create-update-msmcr" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.364099 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7468389a-cc9b-404c-9414-4d81f3b1a7e5-operator-scripts\") pod \"nova-cell1-db-create-qx8nd\" (UID: \"7468389a-cc9b-404c-9414-4d81f3b1a7e5\") " pod="openstack/nova-cell1-db-create-qx8nd" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.394402 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc7ql\" (UniqueName: \"kubernetes.io/projected/7468389a-cc9b-404c-9414-4d81f3b1a7e5-kube-api-access-sc7ql\") pod \"nova-cell1-db-create-qx8nd\" (UID: \"7468389a-cc9b-404c-9414-4d81f3b1a7e5\") " pod="openstack/nova-cell1-db-create-qx8nd" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.442826 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-qx8nd" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.466808 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2f8e017-da89-4ce0-a5b7-2339b2cf18a5-operator-scripts\") pod \"nova-cell0-141e-account-create-update-msmcr\" (UID: \"c2f8e017-da89-4ce0-a5b7-2339b2cf18a5\") " pod="openstack/nova-cell0-141e-account-create-update-msmcr" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.466868 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqc85\" (UniqueName: \"kubernetes.io/projected/c2f8e017-da89-4ce0-a5b7-2339b2cf18a5-kube-api-access-cqc85\") pod \"nova-cell0-141e-account-create-update-msmcr\" (UID: \"c2f8e017-da89-4ce0-a5b7-2339b2cf18a5\") " pod="openstack/nova-cell0-141e-account-create-update-msmcr" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.466971 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b4dbafa-fefb-4947-8d71-f7b0057a2ba0-operator-scripts\") pod \"nova-cell1-cd3e-account-create-update-dkxt7\" (UID: \"2b4dbafa-fefb-4947-8d71-f7b0057a2ba0\") " pod="openstack/nova-cell1-cd3e-account-create-update-dkxt7" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.467010 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcd4c\" (UniqueName: \"kubernetes.io/projected/2b4dbafa-fefb-4947-8d71-f7b0057a2ba0-kube-api-access-gcd4c\") pod \"nova-cell1-cd3e-account-create-update-dkxt7\" (UID: \"2b4dbafa-fefb-4947-8d71-f7b0057a2ba0\") " pod="openstack/nova-cell1-cd3e-account-create-update-dkxt7" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.469270 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c2f8e017-da89-4ce0-a5b7-2339b2cf18a5-operator-scripts\") pod \"nova-cell0-141e-account-create-update-msmcr\" (UID: \"c2f8e017-da89-4ce0-a5b7-2339b2cf18a5\") " pod="openstack/nova-cell0-141e-account-create-update-msmcr" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.500150 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqc85\" (UniqueName: \"kubernetes.io/projected/c2f8e017-da89-4ce0-a5b7-2339b2cf18a5-kube-api-access-cqc85\") pod \"nova-cell0-141e-account-create-update-msmcr\" (UID: \"c2f8e017-da89-4ce0-a5b7-2339b2cf18a5\") " pod="openstack/nova-cell0-141e-account-create-update-msmcr" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.575531 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b4dbafa-fefb-4947-8d71-f7b0057a2ba0-operator-scripts\") pod \"nova-cell1-cd3e-account-create-update-dkxt7\" (UID: \"2b4dbafa-fefb-4947-8d71-f7b0057a2ba0\") " pod="openstack/nova-cell1-cd3e-account-create-update-dkxt7" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.575598 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcd4c\" (UniqueName: \"kubernetes.io/projected/2b4dbafa-fefb-4947-8d71-f7b0057a2ba0-kube-api-access-gcd4c\") pod \"nova-cell1-cd3e-account-create-update-dkxt7\" (UID: \"2b4dbafa-fefb-4947-8d71-f7b0057a2ba0\") " pod="openstack/nova-cell1-cd3e-account-create-update-dkxt7" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.577166 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-141e-account-create-update-msmcr" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.584826 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b4dbafa-fefb-4947-8d71-f7b0057a2ba0-operator-scripts\") pod \"nova-cell1-cd3e-account-create-update-dkxt7\" (UID: \"2b4dbafa-fefb-4947-8d71-f7b0057a2ba0\") " pod="openstack/nova-cell1-cd3e-account-create-update-dkxt7" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.610920 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcd4c\" (UniqueName: \"kubernetes.io/projected/2b4dbafa-fefb-4947-8d71-f7b0057a2ba0-kube-api-access-gcd4c\") pod \"nova-cell1-cd3e-account-create-update-dkxt7\" (UID: \"2b4dbafa-fefb-4947-8d71-f7b0057a2ba0\") " pod="openstack/nova-cell1-cd3e-account-create-update-dkxt7" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.675307 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cd3e-account-create-update-dkxt7" Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.829353 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-kcmlj"] Feb 27 00:28:30 crc kubenswrapper[4781]: I0227 00:28:30.984698 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-9245-account-create-update-j6hsh"] Feb 27 00:28:31 crc kubenswrapper[4781]: I0227 00:28:31.059180 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9245-account-create-update-j6hsh" event={"ID":"6795d880-5f00-4be4-9c67-6f8a251550cb","Type":"ContainerStarted","Data":"e6bf1e81e83109b20ee78a4698c5f46cec050fb71165f48f7b679ef30e434cbe"} Feb 27 00:28:31 crc kubenswrapper[4781]: I0227 00:28:31.070086 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-qx8nd"] Feb 27 00:28:31 crc kubenswrapper[4781]: I0227 00:28:31.080345 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-lgv6f"] Feb 27 00:28:31 crc kubenswrapper[4781]: I0227 00:28:31.090606 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-kcmlj" event={"ID":"7f0e335a-e4a1-48ee-b470-a6277acc5dae","Type":"ContainerStarted","Data":"35c016eb10e43ba219fe0c2064520734a36f7363a214ff58f2ffde62c09da07b"} Feb 27 00:28:31 crc kubenswrapper[4781]: W0227 00:28:31.097665 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7468389a_cc9b_404c_9414_4d81f3b1a7e5.slice/crio-5576b81dd5904dbe30bed1665d83072721616cae57d7d91a0cf5669aa863c1c1 WatchSource:0}: Error finding container 5576b81dd5904dbe30bed1665d83072721616cae57d7d91a0cf5669aa863c1c1: Status 404 returned error can't find the container with id 5576b81dd5904dbe30bed1665d83072721616cae57d7d91a0cf5669aa863c1c1 Feb 27 00:28:31 crc kubenswrapper[4781]: W0227 00:28:31.102379 4781 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b9af6a0_49e8_462c_80d6_df8a3d3bd4ce.slice/crio-8ff62ce7ed0f99a13dbdc14aa1d4d0850ddd0de4549c511924fdd434d0ff9998 WatchSource:0}: Error finding container 8ff62ce7ed0f99a13dbdc14aa1d4d0850ddd0de4549c511924fdd434d0ff9998: Status 404 returned error can't find the container with id 8ff62ce7ed0f99a13dbdc14aa1d4d0850ddd0de4549c511924fdd434d0ff9998 Feb 27 00:28:31 crc kubenswrapper[4781]: I0227 00:28:31.286200 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cd3e-account-create-update-dkxt7"] Feb 27 00:28:31 crc kubenswrapper[4781]: I0227 00:28:31.300349 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-141e-account-create-update-msmcr"] Feb 27 00:28:31 crc kubenswrapper[4781]: I0227 00:28:31.434962 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 27 00:28:31 crc kubenswrapper[4781]: I0227 00:28:31.443737 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 27 00:28:32 crc kubenswrapper[4781]: I0227 00:28:32.109792 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cd3e-account-create-update-dkxt7" event={"ID":"2b4dbafa-fefb-4947-8d71-f7b0057a2ba0","Type":"ContainerStarted","Data":"f0ed84118ee5cf9c2ed24eb79c583f0f931f25971746f3f5ae7f1d86952188c8"} Feb 27 00:28:32 crc kubenswrapper[4781]: I0227 00:28:32.114880 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5c945d84cf-z5v9s" Feb 27 00:28:32 crc kubenswrapper[4781]: I0227 00:28:32.115934 4781 generic.go:334] "Generic (PLEG): container finished" podID="5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce" containerID="24536e1e89dfec02307e517e9566052e3516ec64369f8d65d2939b8e4650f889" exitCode=0 Feb 27 00:28:32 crc kubenswrapper[4781]: I0227 00:28:32.116040 4781 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-lgv6f" event={"ID":"5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce","Type":"ContainerDied","Data":"24536e1e89dfec02307e517e9566052e3516ec64369f8d65d2939b8e4650f889"}
Feb 27 00:28:32 crc kubenswrapper[4781]: I0227 00:28:32.116074 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-lgv6f" event={"ID":"5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce","Type":"ContainerStarted","Data":"8ff62ce7ed0f99a13dbdc14aa1d4d0850ddd0de4549c511924fdd434d0ff9998"}
Feb 27 00:28:32 crc kubenswrapper[4781]: I0227 00:28:32.130396 4781 generic.go:334] "Generic (PLEG): container finished" podID="6795d880-5f00-4be4-9c67-6f8a251550cb" containerID="e064657ef0c106a3592f283bb81ae42d2444dda1caced8f721f45cdcfe863108" exitCode=0
Feb 27 00:28:32 crc kubenswrapper[4781]: I0227 00:28:32.130479 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9245-account-create-update-j6hsh" event={"ID":"6795d880-5f00-4be4-9c67-6f8a251550cb","Type":"ContainerDied","Data":"e064657ef0c106a3592f283bb81ae42d2444dda1caced8f721f45cdcfe863108"}
Feb 27 00:28:32 crc kubenswrapper[4781]: I0227 00:28:32.131959 4781 generic.go:334] "Generic (PLEG): container finished" podID="7468389a-cc9b-404c-9414-4d81f3b1a7e5" containerID="12e5844f351b3d039dc82ba98df27afa29e4eaea9f5b2ec45b3c8cb5d018e0ca" exitCode=0
Feb 27 00:28:32 crc kubenswrapper[4781]: I0227 00:28:32.132071 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-qx8nd" event={"ID":"7468389a-cc9b-404c-9414-4d81f3b1a7e5","Type":"ContainerDied","Data":"12e5844f351b3d039dc82ba98df27afa29e4eaea9f5b2ec45b3c8cb5d018e0ca"}
Feb 27 00:28:32 crc kubenswrapper[4781]: I0227 00:28:32.132123 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-qx8nd" event={"ID":"7468389a-cc9b-404c-9414-4d81f3b1a7e5","Type":"ContainerStarted","Data":"5576b81dd5904dbe30bed1665d83072721616cae57d7d91a0cf5669aa863c1c1"}
Feb 27 00:28:32 crc kubenswrapper[4781]: I0227 00:28:32.133657 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-141e-account-create-update-msmcr" event={"ID":"c2f8e017-da89-4ce0-a5b7-2339b2cf18a5","Type":"ContainerStarted","Data":"3f0c43c1a4e5a5291167e1fa7fd5751b434133130a3345ffad218933d1ced585"}
Feb 27 00:28:32 crc kubenswrapper[4781]: I0227 00:28:32.137299 4781 generic.go:334] "Generic (PLEG): container finished" podID="7f0e335a-e4a1-48ee-b470-a6277acc5dae" containerID="ae3d06d551b95e82732253f74b171a292fd2201889c2e3a5a620c3b16fb394dd" exitCode=0
Feb 27 00:28:32 crc kubenswrapper[4781]: I0227 00:28:32.137347 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-kcmlj" event={"ID":"7f0e335a-e4a1-48ee-b470-a6277acc5dae","Type":"ContainerDied","Data":"ae3d06d551b95e82732253f74b171a292fd2201889c2e3a5a620c3b16fb394dd"}
Feb 27 00:28:32 crc kubenswrapper[4781]: I0227 00:28:32.137440 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5c945d84cf-z5v9s"
Feb 27 00:28:33 crc kubenswrapper[4781]: I0227 00:28:33.148269 4781 generic.go:334] "Generic (PLEG): container finished" podID="2b4dbafa-fefb-4947-8d71-f7b0057a2ba0" containerID="feae0a2cae038402fdacbd138e93b4a28e83ea37dfdf069227fa89f2c8eea228" exitCode=0
Feb 27 00:28:33 crc kubenswrapper[4781]: I0227 00:28:33.148358 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cd3e-account-create-update-dkxt7" event={"ID":"2b4dbafa-fefb-4947-8d71-f7b0057a2ba0","Type":"ContainerDied","Data":"feae0a2cae038402fdacbd138e93b4a28e83ea37dfdf069227fa89f2c8eea228"}
Feb 27 00:28:33 crc kubenswrapper[4781]: I0227 00:28:33.151657 4781 generic.go:334] "Generic (PLEG): container finished" podID="c2f8e017-da89-4ce0-a5b7-2339b2cf18a5" containerID="a7acf67e842e66e4a577e00cfd7561f83ca973cea54d959ed8fb7c9427da2a89" exitCode=0
Feb 27 00:28:33 crc kubenswrapper[4781]: I0227 00:28:33.151735 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-141e-account-create-update-msmcr" event={"ID":"c2f8e017-da89-4ce0-a5b7-2339b2cf18a5","Type":"ContainerDied","Data":"a7acf67e842e66e4a577e00cfd7561f83ca973cea54d959ed8fb7c9427da2a89"}
Feb 27 00:28:33 crc kubenswrapper[4781]: I0227 00:28:33.154771 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"546554c7-b0b0-4363-b1f8-6f83d43562cc","Type":"ContainerStarted","Data":"5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7"}
Feb 27 00:28:33 crc kubenswrapper[4781]: I0227 00:28:33.216124 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.46270243 podStartE2EDuration="8.216102889s" podCreationTimestamp="2026-02-27 00:28:25 +0000 UTC" firstStartedPulling="2026-02-27 00:28:27.052133282 +0000 UTC m=+1376.309672836" lastFinishedPulling="2026-02-27 00:28:31.805533741 +0000 UTC m=+1381.063073295" observedRunningTime="2026-02-27 00:28:33.208811742 +0000 UTC m=+1382.466351316" watchObservedRunningTime="2026-02-27 00:28:33.216102889 +0000 UTC m=+1382.473642443"
Feb 27 00:28:33 crc kubenswrapper[4781]: I0227 00:28:33.738455 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-kcmlj"
Feb 27 00:28:33 crc kubenswrapper[4781]: I0227 00:28:33.849437 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gw86\" (UniqueName: \"kubernetes.io/projected/7f0e335a-e4a1-48ee-b470-a6277acc5dae-kube-api-access-9gw86\") pod \"7f0e335a-e4a1-48ee-b470-a6277acc5dae\" (UID: \"7f0e335a-e4a1-48ee-b470-a6277acc5dae\") "
Feb 27 00:28:33 crc kubenswrapper[4781]: I0227 00:28:33.849503 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f0e335a-e4a1-48ee-b470-a6277acc5dae-operator-scripts\") pod \"7f0e335a-e4a1-48ee-b470-a6277acc5dae\" (UID: \"7f0e335a-e4a1-48ee-b470-a6277acc5dae\") "
Feb 27 00:28:33 crc kubenswrapper[4781]: I0227 00:28:33.850876 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f0e335a-e4a1-48ee-b470-a6277acc5dae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7f0e335a-e4a1-48ee-b470-a6277acc5dae" (UID: "7f0e335a-e4a1-48ee-b470-a6277acc5dae"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:28:33 crc kubenswrapper[4781]: I0227 00:28:33.856028 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f0e335a-e4a1-48ee-b470-a6277acc5dae-kube-api-access-9gw86" (OuterVolumeSpecName: "kube-api-access-9gw86") pod "7f0e335a-e4a1-48ee-b470-a6277acc5dae" (UID: "7f0e335a-e4a1-48ee-b470-a6277acc5dae"). InnerVolumeSpecName "kube-api-access-9gw86". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:28:33 crc kubenswrapper[4781]: I0227 00:28:33.951994 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gw86\" (UniqueName: \"kubernetes.io/projected/7f0e335a-e4a1-48ee-b470-a6277acc5dae-kube-api-access-9gw86\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:33 crc kubenswrapper[4781]: I0227 00:28:33.952351 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f0e335a-e4a1-48ee-b470-a6277acc5dae-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.022241 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-lgv6f"
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.031269 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-9245-account-create-update-j6hsh"
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.036576 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-qx8nd"
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.154531 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nb5sg\" (UniqueName: \"kubernetes.io/projected/5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce-kube-api-access-nb5sg\") pod \"5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce\" (UID: \"5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce\") "
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.154711 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6795d880-5f00-4be4-9c67-6f8a251550cb-operator-scripts\") pod \"6795d880-5f00-4be4-9c67-6f8a251550cb\" (UID: \"6795d880-5f00-4be4-9c67-6f8a251550cb\") "
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.154733 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt8x7\" (UniqueName: \"kubernetes.io/projected/6795d880-5f00-4be4-9c67-6f8a251550cb-kube-api-access-vt8x7\") pod \"6795d880-5f00-4be4-9c67-6f8a251550cb\" (UID: \"6795d880-5f00-4be4-9c67-6f8a251550cb\") "
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.154750 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sc7ql\" (UniqueName: \"kubernetes.io/projected/7468389a-cc9b-404c-9414-4d81f3b1a7e5-kube-api-access-sc7ql\") pod \"7468389a-cc9b-404c-9414-4d81f3b1a7e5\" (UID: \"7468389a-cc9b-404c-9414-4d81f3b1a7e5\") "
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.154898 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce-operator-scripts\") pod \"5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce\" (UID: \"5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce\") "
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.154986 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7468389a-cc9b-404c-9414-4d81f3b1a7e5-operator-scripts\") pod \"7468389a-cc9b-404c-9414-4d81f3b1a7e5\" (UID: \"7468389a-cc9b-404c-9414-4d81f3b1a7e5\") "
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.155349 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6795d880-5f00-4be4-9c67-6f8a251550cb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6795d880-5f00-4be4-9c67-6f8a251550cb" (UID: "6795d880-5f00-4be4-9c67-6f8a251550cb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.155536 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce" (UID: "5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.155809 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7468389a-cc9b-404c-9414-4d81f3b1a7e5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7468389a-cc9b-404c-9414-4d81f3b1a7e5" (UID: "7468389a-cc9b-404c-9414-4d81f3b1a7e5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.160524 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6795d880-5f00-4be4-9c67-6f8a251550cb-kube-api-access-vt8x7" (OuterVolumeSpecName: "kube-api-access-vt8x7") pod "6795d880-5f00-4be4-9c67-6f8a251550cb" (UID: "6795d880-5f00-4be4-9c67-6f8a251550cb"). InnerVolumeSpecName "kube-api-access-vt8x7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.160650 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7468389a-cc9b-404c-9414-4d81f3b1a7e5-kube-api-access-sc7ql" (OuterVolumeSpecName: "kube-api-access-sc7ql") pod "7468389a-cc9b-404c-9414-4d81f3b1a7e5" (UID: "7468389a-cc9b-404c-9414-4d81f3b1a7e5"). InnerVolumeSpecName "kube-api-access-sc7ql". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.160727 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce-kube-api-access-nb5sg" (OuterVolumeSpecName: "kube-api-access-nb5sg") pod "5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce" (UID: "5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce"). InnerVolumeSpecName "kube-api-access-nb5sg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.213667 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-lgv6f" event={"ID":"5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce","Type":"ContainerDied","Data":"8ff62ce7ed0f99a13dbdc14aa1d4d0850ddd0de4549c511924fdd434d0ff9998"}
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.213709 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ff62ce7ed0f99a13dbdc14aa1d4d0850ddd0de4549c511924fdd434d0ff9998"
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.213787 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-lgv6f"
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.222811 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9245-account-create-update-j6hsh" event={"ID":"6795d880-5f00-4be4-9c67-6f8a251550cb","Type":"ContainerDied","Data":"e6bf1e81e83109b20ee78a4698c5f46cec050fb71165f48f7b679ef30e434cbe"}
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.222860 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6bf1e81e83109b20ee78a4698c5f46cec050fb71165f48f7b679ef30e434cbe"
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.222890 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-9245-account-create-update-j6hsh"
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.224847 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-qx8nd" event={"ID":"7468389a-cc9b-404c-9414-4d81f3b1a7e5","Type":"ContainerDied","Data":"5576b81dd5904dbe30bed1665d83072721616cae57d7d91a0cf5669aa863c1c1"}
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.224891 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5576b81dd5904dbe30bed1665d83072721616cae57d7d91a0cf5669aa863c1c1"
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.224971 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-qx8nd"
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.232380 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-kcmlj"
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.232693 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-kcmlj" event={"ID":"7f0e335a-e4a1-48ee-b470-a6277acc5dae","Type":"ContainerDied","Data":"35c016eb10e43ba219fe0c2064520734a36f7363a214ff58f2ffde62c09da07b"}
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.232737 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35c016eb10e43ba219fe0c2064520734a36f7363a214ff58f2ffde62c09da07b"
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.232980 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.257232 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6795d880-5f00-4be4-9c67-6f8a251550cb-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.257258 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt8x7\" (UniqueName: \"kubernetes.io/projected/6795d880-5f00-4be4-9c67-6f8a251550cb-kube-api-access-vt8x7\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.257269 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sc7ql\" (UniqueName: \"kubernetes.io/projected/7468389a-cc9b-404c-9414-4d81f3b1a7e5-kube-api-access-sc7ql\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.257278 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.257287 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7468389a-cc9b-404c-9414-4d81f3b1a7e5-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.257296 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nb5sg\" (UniqueName: \"kubernetes.io/projected/5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce-kube-api-access-nb5sg\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.590482 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-141e-account-create-update-msmcr"
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.722450 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cd3e-account-create-update-dkxt7"
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.766422 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqc85\" (UniqueName: \"kubernetes.io/projected/c2f8e017-da89-4ce0-a5b7-2339b2cf18a5-kube-api-access-cqc85\") pod \"c2f8e017-da89-4ce0-a5b7-2339b2cf18a5\" (UID: \"c2f8e017-da89-4ce0-a5b7-2339b2cf18a5\") "
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.766570 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2f8e017-da89-4ce0-a5b7-2339b2cf18a5-operator-scripts\") pod \"c2f8e017-da89-4ce0-a5b7-2339b2cf18a5\" (UID: \"c2f8e017-da89-4ce0-a5b7-2339b2cf18a5\") "
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.767440 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2f8e017-da89-4ce0-a5b7-2339b2cf18a5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c2f8e017-da89-4ce0-a5b7-2339b2cf18a5" (UID: "c2f8e017-da89-4ce0-a5b7-2339b2cf18a5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.777989 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.779169 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.779949 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2f8e017-da89-4ce0-a5b7-2339b2cf18a5-kube-api-access-cqc85" (OuterVolumeSpecName: "kube-api-access-cqc85") pod "c2f8e017-da89-4ce0-a5b7-2339b2cf18a5" (UID: "c2f8e017-da89-4ce0-a5b7-2339b2cf18a5"). InnerVolumeSpecName "kube-api-access-cqc85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.821227 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.833316 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.869002 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b4dbafa-fefb-4947-8d71-f7b0057a2ba0-operator-scripts\") pod \"2b4dbafa-fefb-4947-8d71-f7b0057a2ba0\" (UID: \"2b4dbafa-fefb-4947-8d71-f7b0057a2ba0\") "
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.869952 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcd4c\" (UniqueName: \"kubernetes.io/projected/2b4dbafa-fefb-4947-8d71-f7b0057a2ba0-kube-api-access-gcd4c\") pod \"2b4dbafa-fefb-4947-8d71-f7b0057a2ba0\" (UID: \"2b4dbafa-fefb-4947-8d71-f7b0057a2ba0\") "
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.869397 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b4dbafa-fefb-4947-8d71-f7b0057a2ba0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2b4dbafa-fefb-4947-8d71-f7b0057a2ba0" (UID: "2b4dbafa-fefb-4947-8d71-f7b0057a2ba0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.870978 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2f8e017-da89-4ce0-a5b7-2339b2cf18a5-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.871003 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqc85\" (UniqueName: \"kubernetes.io/projected/c2f8e017-da89-4ce0-a5b7-2339b2cf18a5-kube-api-access-cqc85\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.871020 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b4dbafa-fefb-4947-8d71-f7b0057a2ba0-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.878939 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b4dbafa-fefb-4947-8d71-f7b0057a2ba0-kube-api-access-gcd4c" (OuterVolumeSpecName: "kube-api-access-gcd4c") pod "2b4dbafa-fefb-4947-8d71-f7b0057a2ba0" (UID: "2b4dbafa-fefb-4947-8d71-f7b0057a2ba0"). InnerVolumeSpecName "kube-api-access-gcd4c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:28:34 crc kubenswrapper[4781]: I0227 00:28:34.972825 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcd4c\" (UniqueName: \"kubernetes.io/projected/2b4dbafa-fefb-4947-8d71-f7b0057a2ba0-kube-api-access-gcd4c\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:35 crc kubenswrapper[4781]: I0227 00:28:35.244206 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-141e-account-create-update-msmcr" event={"ID":"c2f8e017-da89-4ce0-a5b7-2339b2cf18a5","Type":"ContainerDied","Data":"3f0c43c1a4e5a5291167e1fa7fd5751b434133130a3345ffad218933d1ced585"}
Feb 27 00:28:35 crc kubenswrapper[4781]: I0227 00:28:35.244257 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f0c43c1a4e5a5291167e1fa7fd5751b434133130a3345ffad218933d1ced585"
Feb 27 00:28:35 crc kubenswrapper[4781]: I0227 00:28:35.245055 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-141e-account-create-update-msmcr"
Feb 27 00:28:35 crc kubenswrapper[4781]: I0227 00:28:35.245994 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cd3e-account-create-update-dkxt7" event={"ID":"2b4dbafa-fefb-4947-8d71-f7b0057a2ba0","Type":"ContainerDied","Data":"f0ed84118ee5cf9c2ed24eb79c583f0f931f25971746f3f5ae7f1d86952188c8"}
Feb 27 00:28:35 crc kubenswrapper[4781]: I0227 00:28:35.246031 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0ed84118ee5cf9c2ed24eb79c583f0f931f25971746f3f5ae7f1d86952188c8"
Feb 27 00:28:35 crc kubenswrapper[4781]: I0227 00:28:35.246037 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cd3e-account-create-update-dkxt7"
Feb 27 00:28:35 crc kubenswrapper[4781]: I0227 00:28:35.246603 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 27 00:28:35 crc kubenswrapper[4781]: I0227 00:28:35.246735 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 27 00:28:35 crc kubenswrapper[4781]: I0227 00:28:35.604840 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 27 00:28:35 crc kubenswrapper[4781]: I0227 00:28:35.902667 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 27 00:28:35 crc kubenswrapper[4781]: I0227 00:28:35.902930 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 27 00:28:35 crc kubenswrapper[4781]: I0227 00:28:35.937341 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 27 00:28:35 crc kubenswrapper[4781]: I0227 00:28:35.955808 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 27 00:28:36 crc kubenswrapper[4781]: I0227 00:28:36.258365 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 27 00:28:36 crc kubenswrapper[4781]: I0227 00:28:36.258528 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerName="ceilometer-central-agent" containerID="cri-o://3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486" gracePeriod=30
Feb 27 00:28:36 crc kubenswrapper[4781]: I0227 00:28:36.258620 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerName="proxy-httpd" containerID="cri-o://5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7" gracePeriod=30
Feb 27 00:28:36 crc kubenswrapper[4781]: I0227 00:28:36.258641 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerName="sg-core" containerID="cri-o://e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd" gracePeriod=30
Feb 27 00:28:36 crc kubenswrapper[4781]: I0227 00:28:36.258817 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 27 00:28:36 crc kubenswrapper[4781]: I0227 00:28:36.258860 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerName="ceilometer-notification-agent" containerID="cri-o://f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8" gracePeriod=30
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.191600 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.270299 4781 generic.go:334] "Generic (PLEG): container finished" podID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerID="5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7" exitCode=0
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.270333 4781 generic.go:334] "Generic (PLEG): container finished" podID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerID="e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd" exitCode=2
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.270340 4781 generic.go:334] "Generic (PLEG): container finished" podID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerID="f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8" exitCode=0
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.270347 4781 generic.go:334] "Generic (PLEG): container finished" podID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerID="3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486" exitCode=0
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.270366 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.270400 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"546554c7-b0b0-4363-b1f8-6f83d43562cc","Type":"ContainerDied","Data":"5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7"}
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.270441 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"546554c7-b0b0-4363-b1f8-6f83d43562cc","Type":"ContainerDied","Data":"e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd"}
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.270452 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"546554c7-b0b0-4363-b1f8-6f83d43562cc","Type":"ContainerDied","Data":"f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8"}
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.270463 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"546554c7-b0b0-4363-b1f8-6f83d43562cc","Type":"ContainerDied","Data":"3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486"}
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.270471 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"546554c7-b0b0-4363-b1f8-6f83d43562cc","Type":"ContainerDied","Data":"a513fdece7b001c5868cd23c79b7671562961e01d9db2105289006ccfa1d5641"}
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.270486 4781 scope.go:117] "RemoveContainer" containerID="5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.301244 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.301363 4781 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.303586 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.308941 4781 scope.go:117] "RemoveContainer" containerID="e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.320283 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/546554c7-b0b0-4363-b1f8-6f83d43562cc-run-httpd\") pod \"546554c7-b0b0-4363-b1f8-6f83d43562cc\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") "
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.320331 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/546554c7-b0b0-4363-b1f8-6f83d43562cc-log-httpd\") pod \"546554c7-b0b0-4363-b1f8-6f83d43562cc\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") "
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.320427 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-config-data\") pod \"546554c7-b0b0-4363-b1f8-6f83d43562cc\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") "
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.320454 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4sp2\" (UniqueName: \"kubernetes.io/projected/546554c7-b0b0-4363-b1f8-6f83d43562cc-kube-api-access-s4sp2\") pod \"546554c7-b0b0-4363-b1f8-6f83d43562cc\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") "
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.320479 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-sg-core-conf-yaml\") pod \"546554c7-b0b0-4363-b1f8-6f83d43562cc\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") "
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.320527 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-scripts\") pod \"546554c7-b0b0-4363-b1f8-6f83d43562cc\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") "
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.320564 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-combined-ca-bundle\") pod \"546554c7-b0b0-4363-b1f8-6f83d43562cc\" (UID: \"546554c7-b0b0-4363-b1f8-6f83d43562cc\") "
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.323284 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/546554c7-b0b0-4363-b1f8-6f83d43562cc-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "546554c7-b0b0-4363-b1f8-6f83d43562cc" (UID: "546554c7-b0b0-4363-b1f8-6f83d43562cc"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.327017 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/546554c7-b0b0-4363-b1f8-6f83d43562cc-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "546554c7-b0b0-4363-b1f8-6f83d43562cc" (UID: "546554c7-b0b0-4363-b1f8-6f83d43562cc"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.331065 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-scripts" (OuterVolumeSpecName: "scripts") pod "546554c7-b0b0-4363-b1f8-6f83d43562cc" (UID: "546554c7-b0b0-4363-b1f8-6f83d43562cc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.339858 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/546554c7-b0b0-4363-b1f8-6f83d43562cc-kube-api-access-s4sp2" (OuterVolumeSpecName: "kube-api-access-s4sp2") pod "546554c7-b0b0-4363-b1f8-6f83d43562cc" (UID: "546554c7-b0b0-4363-b1f8-6f83d43562cc"). InnerVolumeSpecName "kube-api-access-s4sp2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.347841 4781 scope.go:117] "RemoveContainer" containerID="f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.359835 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "546554c7-b0b0-4363-b1f8-6f83d43562cc" (UID: "546554c7-b0b0-4363-b1f8-6f83d43562cc"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.426181 4781 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/546554c7-b0b0-4363-b1f8-6f83d43562cc-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.426490 4781 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/546554c7-b0b0-4363-b1f8-6f83d43562cc-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.426499 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4sp2\" (UniqueName: \"kubernetes.io/projected/546554c7-b0b0-4363-b1f8-6f83d43562cc-kube-api-access-s4sp2\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.426509 4781 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.426519 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.478912 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "546554c7-b0b0-4363-b1f8-6f83d43562cc" (UID: "546554c7-b0b0-4363-b1f8-6f83d43562cc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.527565 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.540907 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-config-data" (OuterVolumeSpecName: "config-data") pod "546554c7-b0b0-4363-b1f8-6f83d43562cc" (UID: "546554c7-b0b0-4363-b1f8-6f83d43562cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.624988 4781 scope.go:117] "RemoveContainer" containerID="3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486"
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.631048 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/546554c7-b0b0-4363-b1f8-6f83d43562cc-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.632738 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.657041 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.665070 4781 scope.go:117] "RemoveContainer" containerID="5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7"
Feb 27 00:28:37 crc kubenswrapper[4781]: E0227 00:28:37.668210 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7\": container with ID starting with
5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7 not found: ID does not exist" containerID="5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.668255 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7"} err="failed to get container status \"5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7\": rpc error: code = NotFound desc = could not find container \"5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7\": container with ID starting with 5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7 not found: ID does not exist" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.668280 4781 scope.go:117] "RemoveContainer" containerID="e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.675248 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:28:37 crc kubenswrapper[4781]: E0227 00:28:37.675719 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerName="ceilometer-notification-agent" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.675740 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerName="ceilometer-notification-agent" Feb 27 00:28:37 crc kubenswrapper[4781]: E0227 00:28:37.675752 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7468389a-cc9b-404c-9414-4d81f3b1a7e5" containerName="mariadb-database-create" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.675758 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7468389a-cc9b-404c-9414-4d81f3b1a7e5" containerName="mariadb-database-create" Feb 27 00:28:37 crc kubenswrapper[4781]: E0227 00:28:37.675770 4781 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2f8e017-da89-4ce0-a5b7-2339b2cf18a5" containerName="mariadb-account-create-update" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.675777 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2f8e017-da89-4ce0-a5b7-2339b2cf18a5" containerName="mariadb-account-create-update" Feb 27 00:28:37 crc kubenswrapper[4781]: E0227 00:28:37.675803 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerName="proxy-httpd" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.675811 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerName="proxy-httpd" Feb 27 00:28:37 crc kubenswrapper[4781]: E0227 00:28:37.675822 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerName="ceilometer-central-agent" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.675830 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerName="ceilometer-central-agent" Feb 27 00:28:37 crc kubenswrapper[4781]: E0227 00:28:37.675845 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f0e335a-e4a1-48ee-b470-a6277acc5dae" containerName="mariadb-database-create" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.675850 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f0e335a-e4a1-48ee-b470-a6277acc5dae" containerName="mariadb-database-create" Feb 27 00:28:37 crc kubenswrapper[4781]: E0227 00:28:37.675860 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerName="sg-core" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.675866 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerName="sg-core" Feb 27 00:28:37 crc kubenswrapper[4781]: 
E0227 00:28:37.675873 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6795d880-5f00-4be4-9c67-6f8a251550cb" containerName="mariadb-account-create-update" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.675879 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="6795d880-5f00-4be4-9c67-6f8a251550cb" containerName="mariadb-account-create-update" Feb 27 00:28:37 crc kubenswrapper[4781]: E0227 00:28:37.675891 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce" containerName="mariadb-database-create" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.675897 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce" containerName="mariadb-database-create" Feb 27 00:28:37 crc kubenswrapper[4781]: E0227 00:28:37.675904 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b4dbafa-fefb-4947-8d71-f7b0057a2ba0" containerName="mariadb-account-create-update" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.675910 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b4dbafa-fefb-4947-8d71-f7b0057a2ba0" containerName="mariadb-account-create-update" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.676124 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="7468389a-cc9b-404c-9414-4d81f3b1a7e5" containerName="mariadb-database-create" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.676141 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerName="proxy-httpd" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.676160 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce" containerName="mariadb-database-create" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.676174 4781 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerName="sg-core" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.676183 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f0e335a-e4a1-48ee-b470-a6277acc5dae" containerName="mariadb-database-create" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.676199 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2f8e017-da89-4ce0-a5b7-2339b2cf18a5" containerName="mariadb-account-create-update" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.676212 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="6795d880-5f00-4be4-9c67-6f8a251550cb" containerName="mariadb-account-create-update" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.676226 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b4dbafa-fefb-4947-8d71-f7b0057a2ba0" containerName="mariadb-account-create-update" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.676238 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerName="ceilometer-notification-agent" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.676246 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="546554c7-b0b0-4363-b1f8-6f83d43562cc" containerName="ceilometer-central-agent" Feb 27 00:28:37 crc kubenswrapper[4781]: E0227 00:28:37.676433 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd\": container with ID starting with e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd not found: ID does not exist" containerID="e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.676468 4781 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd"} err="failed to get container status \"e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd\": rpc error: code = NotFound desc = could not find container \"e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd\": container with ID starting with e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd not found: ID does not exist" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.676494 4781 scope.go:117] "RemoveContainer" containerID="f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.678021 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:28:37 crc kubenswrapper[4781]: E0227 00:28:37.681743 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8\": container with ID starting with f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8 not found: ID does not exist" containerID="f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.681782 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8"} err="failed to get container status \"f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8\": rpc error: code = NotFound desc = could not find container \"f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8\": container with ID starting with f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8 not found: ID does not exist" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.681804 4781 scope.go:117] "RemoveContainer" 
containerID="3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486" Feb 27 00:28:37 crc kubenswrapper[4781]: E0227 00:28:37.686766 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486\": container with ID starting with 3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486 not found: ID does not exist" containerID="3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.686803 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486"} err="failed to get container status \"3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486\": rpc error: code = NotFound desc = could not find container \"3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486\": container with ID starting with 3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486 not found: ID does not exist" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.686825 4781 scope.go:117] "RemoveContainer" containerID="5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.690701 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7"} err="failed to get container status \"5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7\": rpc error: code = NotFound desc = could not find container \"5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7\": container with ID starting with 5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7 not found: ID does not exist" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.690725 4781 scope.go:117] 
"RemoveContainer" containerID="e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.694874 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd"} err="failed to get container status \"e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd\": rpc error: code = NotFound desc = could not find container \"e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd\": container with ID starting with e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd not found: ID does not exist" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.694924 4781 scope.go:117] "RemoveContainer" containerID="f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.700064 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8"} err="failed to get container status \"f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8\": rpc error: code = NotFound desc = could not find container \"f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8\": container with ID starting with f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8 not found: ID does not exist" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.700113 4781 scope.go:117] "RemoveContainer" containerID="3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.703857 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.704555 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 
00:28:37.705802 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.708920 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486"} err="failed to get container status \"3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486\": rpc error: code = NotFound desc = could not find container \"3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486\": container with ID starting with 3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486 not found: ID does not exist" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.708956 4781 scope.go:117] "RemoveContainer" containerID="5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.711774 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7"} err="failed to get container status \"5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7\": rpc error: code = NotFound desc = could not find container \"5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7\": container with ID starting with 5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7 not found: ID does not exist" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.711800 4781 scope.go:117] "RemoveContainer" containerID="e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.713313 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd"} err="failed to get container status 
\"e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd\": rpc error: code = NotFound desc = could not find container \"e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd\": container with ID starting with e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd not found: ID does not exist" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.713349 4781 scope.go:117] "RemoveContainer" containerID="f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.716922 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8"} err="failed to get container status \"f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8\": rpc error: code = NotFound desc = could not find container \"f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8\": container with ID starting with f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8 not found: ID does not exist" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.716958 4781 scope.go:117] "RemoveContainer" containerID="3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.717281 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486"} err="failed to get container status \"3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486\": rpc error: code = NotFound desc = could not find container \"3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486\": container with ID starting with 3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486 not found: ID does not exist" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.717298 4781 scope.go:117] "RemoveContainer" 
containerID="5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.717546 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7"} err="failed to get container status \"5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7\": rpc error: code = NotFound desc = could not find container \"5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7\": container with ID starting with 5eeab830d47fcb8e4e56a2b8e1530eb9e57b6d9e4c8ca2f1cf8952c1ae58c3b7 not found: ID does not exist" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.717564 4781 scope.go:117] "RemoveContainer" containerID="e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.717838 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd"} err="failed to get container status \"e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd\": rpc error: code = NotFound desc = could not find container \"e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd\": container with ID starting with e9bba1c2ec83fb494ff172071d8a39db2243c79c966493ee50dd1ce3c8a9bbdd not found: ID does not exist" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.717853 4781 scope.go:117] "RemoveContainer" containerID="f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.718102 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8"} err="failed to get container status \"f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8\": rpc error: code = NotFound desc = could 
not find container \"f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8\": container with ID starting with f83df6ccb7b764b7be9db6f0ea888f549b67d913c69001f9816cbb1fb43c56a8 not found: ID does not exist" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.718117 4781 scope.go:117] "RemoveContainer" containerID="3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.718339 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486"} err="failed to get container status \"3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486\": rpc error: code = NotFound desc = could not find container \"3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486\": container with ID starting with 3592571ffa2aa0f4dea061cb15c09f69d2853d3415c4e44ec3f5d4f8b5a0d486 not found: ID does not exist" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.834179 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-scripts\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.834252 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.835047 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-config-data\") pod 
\"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.835195 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hs5n\" (UniqueName: \"kubernetes.io/projected/6c0d1328-b565-4c9e-a9dc-e7b863568260-kube-api-access-2hs5n\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.835306 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.835458 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c0d1328-b565-4c9e-a9dc-e7b863568260-log-httpd\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.835622 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c0d1328-b565-4c9e-a9dc-e7b863568260-run-httpd\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.937928 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c0d1328-b565-4c9e-a9dc-e7b863568260-run-httpd\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 
00:28:37.937993 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-scripts\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.938030 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.938068 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-config-data\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.938104 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hs5n\" (UniqueName: \"kubernetes.io/projected/6c0d1328-b565-4c9e-a9dc-e7b863568260-kube-api-access-2hs5n\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.938123 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.938149 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c0d1328-b565-4c9e-a9dc-e7b863568260-log-httpd\") pod 
\"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.938654 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c0d1328-b565-4c9e-a9dc-e7b863568260-log-httpd\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.938866 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c0d1328-b565-4c9e-a9dc-e7b863568260-run-httpd\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.944440 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-config-data\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.944830 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.946184 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.948293 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-scripts\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.960281 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hs5n\" (UniqueName: \"kubernetes.io/projected/6c0d1328-b565-4c9e-a9dc-e7b863568260-kube-api-access-2hs5n\") pod \"ceilometer-0\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " pod="openstack/ceilometer-0" Feb 27 00:28:37 crc kubenswrapper[4781]: I0227 00:28:37.995185 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:38 crc kubenswrapper[4781]: I0227 00:28:38.004416 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:28:38 crc kubenswrapper[4781]: I0227 00:28:38.040438 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6d64c6bb46-jcp5p" Feb 27 00:28:38 crc kubenswrapper[4781]: I0227 00:28:38.134576 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-76c479bbf8-lkpd7"] Feb 27 00:28:38 crc kubenswrapper[4781]: I0227 00:28:38.134818 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-76c479bbf8-lkpd7" podUID="33c297e1-af3e-46d6-9738-8e6833deaf02" containerName="placement-log" containerID="cri-o://412d907729275d47b2b33bfdb313eef0737f0bef5101c541d8006af94fa96bc8" gracePeriod=30 Feb 27 00:28:38 crc kubenswrapper[4781]: I0227 00:28:38.135224 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-76c479bbf8-lkpd7" podUID="33c297e1-af3e-46d6-9738-8e6833deaf02" containerName="placement-api" containerID="cri-o://6d2f108203b1ec74c18c2d296290e2f313baef1cea6d95e946e71ef00a537f68" gracePeriod=30 Feb 27 00:28:38 crc kubenswrapper[4781]: I0227 
00:28:38.292280 4781 generic.go:334] "Generic (PLEG): container finished" podID="33c297e1-af3e-46d6-9738-8e6833deaf02" containerID="412d907729275d47b2b33bfdb313eef0737f0bef5101c541d8006af94fa96bc8" exitCode=143 Feb 27 00:28:38 crc kubenswrapper[4781]: I0227 00:28:38.292697 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76c479bbf8-lkpd7" event={"ID":"33c297e1-af3e-46d6-9738-8e6833deaf02","Type":"ContainerDied","Data":"412d907729275d47b2b33bfdb313eef0737f0bef5101c541d8006af94fa96bc8"} Feb 27 00:28:38 crc kubenswrapper[4781]: I0227 00:28:38.549151 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:28:38 crc kubenswrapper[4781]: I0227 00:28:38.580954 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:28:38 crc kubenswrapper[4781]: I0227 00:28:38.651184 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 27 00:28:38 crc kubenswrapper[4781]: I0227 00:28:38.651276 4781 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 27 00:28:38 crc kubenswrapper[4781]: I0227 00:28:38.797266 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 27 00:28:39 crc kubenswrapper[4781]: I0227 00:28:39.304329 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c0d1328-b565-4c9e-a9dc-e7b863568260","Type":"ContainerStarted","Data":"7a5345b65b014bc9d0e2cd844013d91d1a91d4e408c41f0c7f4f964de80130f6"} Feb 27 00:28:39 crc kubenswrapper[4781]: I0227 00:28:39.304649 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c0d1328-b565-4c9e-a9dc-e7b863568260","Type":"ContainerStarted","Data":"72f31190f456d76c84b55f85362321bc4ca382df7c3a1c86e9e23616be0d7246"} Feb 27 00:28:39 crc kubenswrapper[4781]: I0227 00:28:39.321692 4781 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="546554c7-b0b0-4363-b1f8-6f83d43562cc" path="/var/lib/kubelet/pods/546554c7-b0b0-4363-b1f8-6f83d43562cc/volumes" Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.317083 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c0d1328-b565-4c9e-a9dc-e7b863568260","Type":"ContainerStarted","Data":"16ba8a242e20589655027929d1c82fa25c3d9fc988018051237357efea8a8ec9"} Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.442749 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9cntr"] Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.448521 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9cntr" Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.451613 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.451913 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.455030 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-lsptr" Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.463311 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9cntr"] Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.607642 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksn6l\" (UniqueName: \"kubernetes.io/projected/d71a5c1e-7953-4acf-813a-0d96d4992d1f-kube-api-access-ksn6l\") pod \"nova-cell0-conductor-db-sync-9cntr\" (UID: \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\") " pod="openstack/nova-cell0-conductor-db-sync-9cntr" Feb 27 00:28:40 crc 
kubenswrapper[4781]: I0227 00:28:40.607706 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d71a5c1e-7953-4acf-813a-0d96d4992d1f-config-data\") pod \"nova-cell0-conductor-db-sync-9cntr\" (UID: \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\") " pod="openstack/nova-cell0-conductor-db-sync-9cntr" Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.607743 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d71a5c1e-7953-4acf-813a-0d96d4992d1f-scripts\") pod \"nova-cell0-conductor-db-sync-9cntr\" (UID: \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\") " pod="openstack/nova-cell0-conductor-db-sync-9cntr" Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.607883 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d71a5c1e-7953-4acf-813a-0d96d4992d1f-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-9cntr\" (UID: \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\") " pod="openstack/nova-cell0-conductor-db-sync-9cntr" Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.710114 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksn6l\" (UniqueName: \"kubernetes.io/projected/d71a5c1e-7953-4acf-813a-0d96d4992d1f-kube-api-access-ksn6l\") pod \"nova-cell0-conductor-db-sync-9cntr\" (UID: \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\") " pod="openstack/nova-cell0-conductor-db-sync-9cntr" Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.710162 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d71a5c1e-7953-4acf-813a-0d96d4992d1f-config-data\") pod \"nova-cell0-conductor-db-sync-9cntr\" (UID: \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\") " 
pod="openstack/nova-cell0-conductor-db-sync-9cntr" Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.710187 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d71a5c1e-7953-4acf-813a-0d96d4992d1f-scripts\") pod \"nova-cell0-conductor-db-sync-9cntr\" (UID: \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\") " pod="openstack/nova-cell0-conductor-db-sync-9cntr" Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.710245 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d71a5c1e-7953-4acf-813a-0d96d4992d1f-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-9cntr\" (UID: \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\") " pod="openstack/nova-cell0-conductor-db-sync-9cntr" Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.717356 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d71a5c1e-7953-4acf-813a-0d96d4992d1f-config-data\") pod \"nova-cell0-conductor-db-sync-9cntr\" (UID: \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\") " pod="openstack/nova-cell0-conductor-db-sync-9cntr" Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.717702 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d71a5c1e-7953-4acf-813a-0d96d4992d1f-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-9cntr\" (UID: \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\") " pod="openstack/nova-cell0-conductor-db-sync-9cntr" Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.721065 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d71a5c1e-7953-4acf-813a-0d96d4992d1f-scripts\") pod \"nova-cell0-conductor-db-sync-9cntr\" (UID: \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\") " pod="openstack/nova-cell0-conductor-db-sync-9cntr" 
Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.741203 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksn6l\" (UniqueName: \"kubernetes.io/projected/d71a5c1e-7953-4acf-813a-0d96d4992d1f-kube-api-access-ksn6l\") pod \"nova-cell0-conductor-db-sync-9cntr\" (UID: \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\") " pod="openstack/nova-cell0-conductor-db-sync-9cntr" Feb 27 00:28:40 crc kubenswrapper[4781]: I0227 00:28:40.850963 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9cntr" Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.343521 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c0d1328-b565-4c9e-a9dc-e7b863568260","Type":"ContainerStarted","Data":"9155e1f68a6370d2a59d952aff96914080df4756a62f18bb9bbc3ec507e49ef4"} Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.420565 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9cntr"] Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.756220 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.832508 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-combined-ca-bundle\") pod \"33c297e1-af3e-46d6-9738-8e6833deaf02\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.832914 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnnv4\" (UniqueName: \"kubernetes.io/projected/33c297e1-af3e-46d6-9738-8e6833deaf02-kube-api-access-jnnv4\") pod \"33c297e1-af3e-46d6-9738-8e6833deaf02\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.832988 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-config-data\") pod \"33c297e1-af3e-46d6-9738-8e6833deaf02\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.833006 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-internal-tls-certs\") pod \"33c297e1-af3e-46d6-9738-8e6833deaf02\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.833072 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33c297e1-af3e-46d6-9738-8e6833deaf02-logs\") pod \"33c297e1-af3e-46d6-9738-8e6833deaf02\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.833092 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-public-tls-certs\") pod \"33c297e1-af3e-46d6-9738-8e6833deaf02\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.833118 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-scripts\") pod \"33c297e1-af3e-46d6-9738-8e6833deaf02\" (UID: \"33c297e1-af3e-46d6-9738-8e6833deaf02\") " Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.834672 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33c297e1-af3e-46d6-9738-8e6833deaf02-logs" (OuterVolumeSpecName: "logs") pod "33c297e1-af3e-46d6-9738-8e6833deaf02" (UID: "33c297e1-af3e-46d6-9738-8e6833deaf02"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.840548 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-scripts" (OuterVolumeSpecName: "scripts") pod "33c297e1-af3e-46d6-9738-8e6833deaf02" (UID: "33c297e1-af3e-46d6-9738-8e6833deaf02"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.840832 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33c297e1-af3e-46d6-9738-8e6833deaf02-kube-api-access-jnnv4" (OuterVolumeSpecName: "kube-api-access-jnnv4") pod "33c297e1-af3e-46d6-9738-8e6833deaf02" (UID: "33c297e1-af3e-46d6-9738-8e6833deaf02"). InnerVolumeSpecName "kube-api-access-jnnv4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.903888 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33c297e1-af3e-46d6-9738-8e6833deaf02" (UID: "33c297e1-af3e-46d6-9738-8e6833deaf02"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.935897 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnnv4\" (UniqueName: \"kubernetes.io/projected/33c297e1-af3e-46d6-9738-8e6833deaf02-kube-api-access-jnnv4\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.936040 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33c297e1-af3e-46d6-9738-8e6833deaf02-logs\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.936063 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.936072 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:41 crc kubenswrapper[4781]: I0227 00:28:41.941873 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-config-data" (OuterVolumeSpecName: "config-data") pod "33c297e1-af3e-46d6-9738-8e6833deaf02" (UID: "33c297e1-af3e-46d6-9738-8e6833deaf02"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:42 crc kubenswrapper[4781]: I0227 00:28:42.024304 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "33c297e1-af3e-46d6-9738-8e6833deaf02" (UID: "33c297e1-af3e-46d6-9738-8e6833deaf02"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:42 crc kubenswrapper[4781]: I0227 00:28:42.026748 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "33c297e1-af3e-46d6-9738-8e6833deaf02" (UID: "33c297e1-af3e-46d6-9738-8e6833deaf02"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:28:42 crc kubenswrapper[4781]: I0227 00:28:42.039201 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:42 crc kubenswrapper[4781]: I0227 00:28:42.039236 4781 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:42 crc kubenswrapper[4781]: I0227 00:28:42.039280 4781 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c297e1-af3e-46d6-9738-8e6833deaf02-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:28:42 crc kubenswrapper[4781]: I0227 00:28:42.384882 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9cntr" 
event={"ID":"d71a5c1e-7953-4acf-813a-0d96d4992d1f","Type":"ContainerStarted","Data":"b913bb52004e54e9c0de1dc5c1250761b05eab25edf15ed18eac691db4593cf7"} Feb 27 00:28:42 crc kubenswrapper[4781]: I0227 00:28:42.387218 4781 generic.go:334] "Generic (PLEG): container finished" podID="33c297e1-af3e-46d6-9738-8e6833deaf02" containerID="6d2f108203b1ec74c18c2d296290e2f313baef1cea6d95e946e71ef00a537f68" exitCode=0 Feb 27 00:28:42 crc kubenswrapper[4781]: I0227 00:28:42.387248 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76c479bbf8-lkpd7" event={"ID":"33c297e1-af3e-46d6-9738-8e6833deaf02","Type":"ContainerDied","Data":"6d2f108203b1ec74c18c2d296290e2f313baef1cea6d95e946e71ef00a537f68"} Feb 27 00:28:42 crc kubenswrapper[4781]: I0227 00:28:42.387263 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76c479bbf8-lkpd7" event={"ID":"33c297e1-af3e-46d6-9738-8e6833deaf02","Type":"ContainerDied","Data":"21b15cb407945a01adc26829ab99f15cd9c656e66d81cf610b3118b8b9526261"} Feb 27 00:28:42 crc kubenswrapper[4781]: I0227 00:28:42.387278 4781 scope.go:117] "RemoveContainer" containerID="6d2f108203b1ec74c18c2d296290e2f313baef1cea6d95e946e71ef00a537f68" Feb 27 00:28:42 crc kubenswrapper[4781]: I0227 00:28:42.387400 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-76c479bbf8-lkpd7" Feb 27 00:28:42 crc kubenswrapper[4781]: I0227 00:28:42.413172 4781 scope.go:117] "RemoveContainer" containerID="412d907729275d47b2b33bfdb313eef0737f0bef5101c541d8006af94fa96bc8" Feb 27 00:28:42 crc kubenswrapper[4781]: I0227 00:28:42.431688 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-76c479bbf8-lkpd7"] Feb 27 00:28:42 crc kubenswrapper[4781]: I0227 00:28:42.449628 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-76c479bbf8-lkpd7"] Feb 27 00:28:42 crc kubenswrapper[4781]: I0227 00:28:42.456654 4781 scope.go:117] "RemoveContainer" containerID="6d2f108203b1ec74c18c2d296290e2f313baef1cea6d95e946e71ef00a537f68" Feb 27 00:28:42 crc kubenswrapper[4781]: E0227 00:28:42.460788 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d2f108203b1ec74c18c2d296290e2f313baef1cea6d95e946e71ef00a537f68\": container with ID starting with 6d2f108203b1ec74c18c2d296290e2f313baef1cea6d95e946e71ef00a537f68 not found: ID does not exist" containerID="6d2f108203b1ec74c18c2d296290e2f313baef1cea6d95e946e71ef00a537f68" Feb 27 00:28:42 crc kubenswrapper[4781]: I0227 00:28:42.460845 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d2f108203b1ec74c18c2d296290e2f313baef1cea6d95e946e71ef00a537f68"} err="failed to get container status \"6d2f108203b1ec74c18c2d296290e2f313baef1cea6d95e946e71ef00a537f68\": rpc error: code = NotFound desc = could not find container \"6d2f108203b1ec74c18c2d296290e2f313baef1cea6d95e946e71ef00a537f68\": container with ID starting with 6d2f108203b1ec74c18c2d296290e2f313baef1cea6d95e946e71ef00a537f68 not found: ID does not exist" Feb 27 00:28:42 crc kubenswrapper[4781]: I0227 00:28:42.460869 4781 scope.go:117] "RemoveContainer" containerID="412d907729275d47b2b33bfdb313eef0737f0bef5101c541d8006af94fa96bc8" Feb 27 
00:28:42 crc kubenswrapper[4781]: E0227 00:28:42.461364 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"412d907729275d47b2b33bfdb313eef0737f0bef5101c541d8006af94fa96bc8\": container with ID starting with 412d907729275d47b2b33bfdb313eef0737f0bef5101c541d8006af94fa96bc8 not found: ID does not exist" containerID="412d907729275d47b2b33bfdb313eef0737f0bef5101c541d8006af94fa96bc8" Feb 27 00:28:42 crc kubenswrapper[4781]: I0227 00:28:42.461412 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"412d907729275d47b2b33bfdb313eef0737f0bef5101c541d8006af94fa96bc8"} err="failed to get container status \"412d907729275d47b2b33bfdb313eef0737f0bef5101c541d8006af94fa96bc8\": rpc error: code = NotFound desc = could not find container \"412d907729275d47b2b33bfdb313eef0737f0bef5101c541d8006af94fa96bc8\": container with ID starting with 412d907729275d47b2b33bfdb313eef0737f0bef5101c541d8006af94fa96bc8 not found: ID does not exist" Feb 27 00:28:43 crc kubenswrapper[4781]: I0227 00:28:43.023164 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-api-0" Feb 27 00:28:43 crc kubenswrapper[4781]: I0227 00:28:43.323950 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33c297e1-af3e-46d6-9738-8e6833deaf02" path="/var/lib/kubelet/pods/33c297e1-af3e-46d6-9738-8e6833deaf02/volumes" Feb 27 00:28:43 crc kubenswrapper[4781]: I0227 00:28:43.412235 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c0d1328-b565-4c9e-a9dc-e7b863568260","Type":"ContainerStarted","Data":"cfcdb38663d80d12b7e86a05dfe2ce7cc23ff17e6af4e336ba2f0e4a180806c3"} Feb 27 00:28:43 crc kubenswrapper[4781]: I0227 00:28:43.412416 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6c0d1328-b565-4c9e-a9dc-e7b863568260" 
containerName="ceilometer-central-agent" containerID="cri-o://7a5345b65b014bc9d0e2cd844013d91d1a91d4e408c41f0c7f4f964de80130f6" gracePeriod=30 Feb 27 00:28:43 crc kubenswrapper[4781]: I0227 00:28:43.412820 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 27 00:28:43 crc kubenswrapper[4781]: I0227 00:28:43.413212 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerName="proxy-httpd" containerID="cri-o://cfcdb38663d80d12b7e86a05dfe2ce7cc23ff17e6af4e336ba2f0e4a180806c3" gracePeriod=30 Feb 27 00:28:43 crc kubenswrapper[4781]: I0227 00:28:43.413283 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerName="sg-core" containerID="cri-o://9155e1f68a6370d2a59d952aff96914080df4756a62f18bb9bbc3ec507e49ef4" gracePeriod=30 Feb 27 00:28:43 crc kubenswrapper[4781]: I0227 00:28:43.413341 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerName="ceilometer-notification-agent" containerID="cri-o://16ba8a242e20589655027929d1c82fa25c3d9fc988018051237357efea8a8ec9" gracePeriod=30 Feb 27 00:28:44 crc kubenswrapper[4781]: I0227 00:28:44.424789 4781 generic.go:334] "Generic (PLEG): container finished" podID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerID="9155e1f68a6370d2a59d952aff96914080df4756a62f18bb9bbc3ec507e49ef4" exitCode=2 Feb 27 00:28:44 crc kubenswrapper[4781]: I0227 00:28:44.425122 4781 generic.go:334] "Generic (PLEG): container finished" podID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerID="16ba8a242e20589655027929d1c82fa25c3d9fc988018051237357efea8a8ec9" exitCode=0 Feb 27 00:28:44 crc kubenswrapper[4781]: I0227 00:28:44.424881 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"6c0d1328-b565-4c9e-a9dc-e7b863568260","Type":"ContainerDied","Data":"9155e1f68a6370d2a59d952aff96914080df4756a62f18bb9bbc3ec507e49ef4"} Feb 27 00:28:44 crc kubenswrapper[4781]: I0227 00:28:44.425169 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c0d1328-b565-4c9e-a9dc-e7b863568260","Type":"ContainerDied","Data":"16ba8a242e20589655027929d1c82fa25c3d9fc988018051237357efea8a8ec9"} Feb 27 00:28:51 crc kubenswrapper[4781]: I0227 00:28:51.344008 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=9.822551053 podStartE2EDuration="14.343982805s" podCreationTimestamp="2026-02-27 00:28:37 +0000 UTC" firstStartedPulling="2026-02-27 00:28:38.551308606 +0000 UTC m=+1387.808848150" lastFinishedPulling="2026-02-27 00:28:43.072740348 +0000 UTC m=+1392.330279902" observedRunningTime="2026-02-27 00:28:43.441425818 +0000 UTC m=+1392.698965372" watchObservedRunningTime="2026-02-27 00:28:51.343982805 +0000 UTC m=+1400.601522369" Feb 27 00:28:51 crc kubenswrapper[4781]: I0227 00:28:51.502050 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9cntr" event={"ID":"d71a5c1e-7953-4acf-813a-0d96d4992d1f","Type":"ContainerStarted","Data":"a4bad047d90bd3b11bea212cddee0782007013387656451beeca5b44aee50150"} Feb 27 00:28:51 crc kubenswrapper[4781]: I0227 00:28:51.519524 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-9cntr" podStartSLOduration=2.37584504 podStartE2EDuration="11.519507997s" podCreationTimestamp="2026-02-27 00:28:40 +0000 UTC" firstStartedPulling="2026-02-27 00:28:41.426818341 +0000 UTC m=+1390.684357895" lastFinishedPulling="2026-02-27 00:28:50.570481298 +0000 UTC m=+1399.828020852" observedRunningTime="2026-02-27 00:28:51.515736415 +0000 UTC m=+1400.773275969" watchObservedRunningTime="2026-02-27 
00:28:51.519507997 +0000 UTC m=+1400.777047551" Feb 27 00:28:53 crc kubenswrapper[4781]: I0227 00:28:53.525820 4781 generic.go:334] "Generic (PLEG): container finished" podID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerID="7a5345b65b014bc9d0e2cd844013d91d1a91d4e408c41f0c7f4f964de80130f6" exitCode=0 Feb 27 00:28:53 crc kubenswrapper[4781]: I0227 00:28:53.525874 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c0d1328-b565-4c9e-a9dc-e7b863568260","Type":"ContainerDied","Data":"7a5345b65b014bc9d0e2cd844013d91d1a91d4e408c41f0c7f4f964de80130f6"} Feb 27 00:29:01 crc kubenswrapper[4781]: I0227 00:29:01.622373 4781 generic.go:334] "Generic (PLEG): container finished" podID="d71a5c1e-7953-4acf-813a-0d96d4992d1f" containerID="a4bad047d90bd3b11bea212cddee0782007013387656451beeca5b44aee50150" exitCode=0 Feb 27 00:29:01 crc kubenswrapper[4781]: I0227 00:29:01.622479 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9cntr" event={"ID":"d71a5c1e-7953-4acf-813a-0d96d4992d1f","Type":"ContainerDied","Data":"a4bad047d90bd3b11bea212cddee0782007013387656451beeca5b44aee50150"} Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.093459 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9cntr" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.195362 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksn6l\" (UniqueName: \"kubernetes.io/projected/d71a5c1e-7953-4acf-813a-0d96d4992d1f-kube-api-access-ksn6l\") pod \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\" (UID: \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\") " Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.195404 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d71a5c1e-7953-4acf-813a-0d96d4992d1f-combined-ca-bundle\") pod \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\" (UID: \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\") " Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.195530 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d71a5c1e-7953-4acf-813a-0d96d4992d1f-scripts\") pod \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\" (UID: \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\") " Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.195594 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d71a5c1e-7953-4acf-813a-0d96d4992d1f-config-data\") pod \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\" (UID: \"d71a5c1e-7953-4acf-813a-0d96d4992d1f\") " Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.201803 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d71a5c1e-7953-4acf-813a-0d96d4992d1f-kube-api-access-ksn6l" (OuterVolumeSpecName: "kube-api-access-ksn6l") pod "d71a5c1e-7953-4acf-813a-0d96d4992d1f" (UID: "d71a5c1e-7953-4acf-813a-0d96d4992d1f"). InnerVolumeSpecName "kube-api-access-ksn6l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.201943 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d71a5c1e-7953-4acf-813a-0d96d4992d1f-scripts" (OuterVolumeSpecName: "scripts") pod "d71a5c1e-7953-4acf-813a-0d96d4992d1f" (UID: "d71a5c1e-7953-4acf-813a-0d96d4992d1f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.226678 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d71a5c1e-7953-4acf-813a-0d96d4992d1f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d71a5c1e-7953-4acf-813a-0d96d4992d1f" (UID: "d71a5c1e-7953-4acf-813a-0d96d4992d1f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.227715 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d71a5c1e-7953-4acf-813a-0d96d4992d1f-config-data" (OuterVolumeSpecName: "config-data") pod "d71a5c1e-7953-4acf-813a-0d96d4992d1f" (UID: "d71a5c1e-7953-4acf-813a-0d96d4992d1f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.297249 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksn6l\" (UniqueName: \"kubernetes.io/projected/d71a5c1e-7953-4acf-813a-0d96d4992d1f-kube-api-access-ksn6l\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.297277 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d71a5c1e-7953-4acf-813a-0d96d4992d1f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.297287 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d71a5c1e-7953-4acf-813a-0d96d4992d1f-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.297295 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d71a5c1e-7953-4acf-813a-0d96d4992d1f-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.643454 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9cntr" event={"ID":"d71a5c1e-7953-4acf-813a-0d96d4992d1f","Type":"ContainerDied","Data":"b913bb52004e54e9c0de1dc5c1250761b05eab25edf15ed18eac691db4593cf7"} Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.643792 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b913bb52004e54e9c0de1dc5c1250761b05eab25edf15ed18eac691db4593cf7" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.643519 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9cntr" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.745830 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 27 00:29:03 crc kubenswrapper[4781]: E0227 00:29:03.746312 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33c297e1-af3e-46d6-9738-8e6833deaf02" containerName="placement-log" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.746329 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="33c297e1-af3e-46d6-9738-8e6833deaf02" containerName="placement-log" Feb 27 00:29:03 crc kubenswrapper[4781]: E0227 00:29:03.746350 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33c297e1-af3e-46d6-9738-8e6833deaf02" containerName="placement-api" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.746357 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="33c297e1-af3e-46d6-9738-8e6833deaf02" containerName="placement-api" Feb 27 00:29:03 crc kubenswrapper[4781]: E0227 00:29:03.746376 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d71a5c1e-7953-4acf-813a-0d96d4992d1f" containerName="nova-cell0-conductor-db-sync" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.746382 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d71a5c1e-7953-4acf-813a-0d96d4992d1f" containerName="nova-cell0-conductor-db-sync" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.746569 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="33c297e1-af3e-46d6-9738-8e6833deaf02" containerName="placement-log" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.746589 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="33c297e1-af3e-46d6-9738-8e6833deaf02" containerName="placement-api" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.746612 4781 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d71a5c1e-7953-4acf-813a-0d96d4992d1f" containerName="nova-cell0-conductor-db-sync" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.747355 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.750019 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.754972 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-lsptr" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.756480 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.909824 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fvr6\" (UniqueName: \"kubernetes.io/projected/7503d0a7-eca6-4d15-9538-9cded970acc2-kube-api-access-5fvr6\") pod \"nova-cell0-conductor-0\" (UID: \"7503d0a7-eca6-4d15-9538-9cded970acc2\") " pod="openstack/nova-cell0-conductor-0" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.910109 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7503d0a7-eca6-4d15-9538-9cded970acc2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7503d0a7-eca6-4d15-9538-9cded970acc2\") " pod="openstack/nova-cell0-conductor-0" Feb 27 00:29:03 crc kubenswrapper[4781]: I0227 00:29:03.910259 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7503d0a7-eca6-4d15-9538-9cded970acc2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7503d0a7-eca6-4d15-9538-9cded970acc2\") " pod="openstack/nova-cell0-conductor-0" Feb 27 00:29:04 crc 
kubenswrapper[4781]: I0227 00:29:04.012070 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7503d0a7-eca6-4d15-9538-9cded970acc2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7503d0a7-eca6-4d15-9538-9cded970acc2\") " pod="openstack/nova-cell0-conductor-0" Feb 27 00:29:04 crc kubenswrapper[4781]: I0227 00:29:04.012234 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fvr6\" (UniqueName: \"kubernetes.io/projected/7503d0a7-eca6-4d15-9538-9cded970acc2-kube-api-access-5fvr6\") pod \"nova-cell0-conductor-0\" (UID: \"7503d0a7-eca6-4d15-9538-9cded970acc2\") " pod="openstack/nova-cell0-conductor-0" Feb 27 00:29:04 crc kubenswrapper[4781]: I0227 00:29:04.012328 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7503d0a7-eca6-4d15-9538-9cded970acc2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7503d0a7-eca6-4d15-9538-9cded970acc2\") " pod="openstack/nova-cell0-conductor-0" Feb 27 00:29:04 crc kubenswrapper[4781]: I0227 00:29:04.018618 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7503d0a7-eca6-4d15-9538-9cded970acc2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7503d0a7-eca6-4d15-9538-9cded970acc2\") " pod="openstack/nova-cell0-conductor-0" Feb 27 00:29:04 crc kubenswrapper[4781]: I0227 00:29:04.018711 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7503d0a7-eca6-4d15-9538-9cded970acc2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7503d0a7-eca6-4d15-9538-9cded970acc2\") " pod="openstack/nova-cell0-conductor-0" Feb 27 00:29:04 crc kubenswrapper[4781]: I0227 00:29:04.047253 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5fvr6\" (UniqueName: \"kubernetes.io/projected/7503d0a7-eca6-4d15-9538-9cded970acc2-kube-api-access-5fvr6\") pod \"nova-cell0-conductor-0\" (UID: \"7503d0a7-eca6-4d15-9538-9cded970acc2\") " pod="openstack/nova-cell0-conductor-0" Feb 27 00:29:04 crc kubenswrapper[4781]: I0227 00:29:04.064123 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 27 00:29:04 crc kubenswrapper[4781]: I0227 00:29:04.530868 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 27 00:29:04 crc kubenswrapper[4781]: I0227 00:29:04.657825 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7503d0a7-eca6-4d15-9538-9cded970acc2","Type":"ContainerStarted","Data":"51ce2d4968b5aa25bc43cb9a6c14264106f6edeb7b625171dbf676e0c936523b"} Feb 27 00:29:05 crc kubenswrapper[4781]: I0227 00:29:05.669167 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7503d0a7-eca6-4d15-9538-9cded970acc2","Type":"ContainerStarted","Data":"829d4073c292cda6b13ce3bcf1e5167716db1791de3771bbdb28e0917b02ba8b"} Feb 27 00:29:05 crc kubenswrapper[4781]: I0227 00:29:05.669712 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 27 00:29:05 crc kubenswrapper[4781]: I0227 00:29:05.699915 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.699882075 podStartE2EDuration="2.699882075s" podCreationTimestamp="2026-02-27 00:29:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:29:05.683791421 +0000 UTC m=+1414.941330995" watchObservedRunningTime="2026-02-27 00:29:05.699882075 +0000 UTC m=+1414.957421669" Feb 27 00:29:08 crc kubenswrapper[4781]: I0227 00:29:08.009229 
4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.099926 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.622066 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-qjkwv"] Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.623658 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qjkwv" Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.626275 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.626686 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.632177 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-qjkwv"] Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.733272 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd521dc6-4126-4c51-8634-66db8ba1412e-config-data\") pod \"nova-cell0-cell-mapping-qjkwv\" (UID: \"cd521dc6-4126-4c51-8634-66db8ba1412e\") " pod="openstack/nova-cell0-cell-mapping-qjkwv" Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.733663 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd521dc6-4126-4c51-8634-66db8ba1412e-scripts\") pod \"nova-cell0-cell-mapping-qjkwv\" (UID: 
\"cd521dc6-4126-4c51-8634-66db8ba1412e\") " pod="openstack/nova-cell0-cell-mapping-qjkwv" Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.733725 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd521dc6-4126-4c51-8634-66db8ba1412e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-qjkwv\" (UID: \"cd521dc6-4126-4c51-8634-66db8ba1412e\") " pod="openstack/nova-cell0-cell-mapping-qjkwv" Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.733758 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7flr\" (UniqueName: \"kubernetes.io/projected/cd521dc6-4126-4c51-8634-66db8ba1412e-kube-api-access-f7flr\") pod \"nova-cell0-cell-mapping-qjkwv\" (UID: \"cd521dc6-4126-4c51-8634-66db8ba1412e\") " pod="openstack/nova-cell0-cell-mapping-qjkwv" Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.835734 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd521dc6-4126-4c51-8634-66db8ba1412e-config-data\") pod \"nova-cell0-cell-mapping-qjkwv\" (UID: \"cd521dc6-4126-4c51-8634-66db8ba1412e\") " pod="openstack/nova-cell0-cell-mapping-qjkwv" Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.835775 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd521dc6-4126-4c51-8634-66db8ba1412e-scripts\") pod \"nova-cell0-cell-mapping-qjkwv\" (UID: \"cd521dc6-4126-4c51-8634-66db8ba1412e\") " pod="openstack/nova-cell0-cell-mapping-qjkwv" Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.835826 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd521dc6-4126-4c51-8634-66db8ba1412e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-qjkwv\" (UID: 
\"cd521dc6-4126-4c51-8634-66db8ba1412e\") " pod="openstack/nova-cell0-cell-mapping-qjkwv" Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.835848 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7flr\" (UniqueName: \"kubernetes.io/projected/cd521dc6-4126-4c51-8634-66db8ba1412e-kube-api-access-f7flr\") pod \"nova-cell0-cell-mapping-qjkwv\" (UID: \"cd521dc6-4126-4c51-8634-66db8ba1412e\") " pod="openstack/nova-cell0-cell-mapping-qjkwv" Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.847356 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd521dc6-4126-4c51-8634-66db8ba1412e-config-data\") pod \"nova-cell0-cell-mapping-qjkwv\" (UID: \"cd521dc6-4126-4c51-8634-66db8ba1412e\") " pod="openstack/nova-cell0-cell-mapping-qjkwv" Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.848245 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd521dc6-4126-4c51-8634-66db8ba1412e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-qjkwv\" (UID: \"cd521dc6-4126-4c51-8634-66db8ba1412e\") " pod="openstack/nova-cell0-cell-mapping-qjkwv" Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.866984 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd521dc6-4126-4c51-8634-66db8ba1412e-scripts\") pod \"nova-cell0-cell-mapping-qjkwv\" (UID: \"cd521dc6-4126-4c51-8634-66db8ba1412e\") " pod="openstack/nova-cell0-cell-mapping-qjkwv" Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.869923 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.871770 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.876577 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.884714 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.887467 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.898097 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.924720 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7flr\" (UniqueName: \"kubernetes.io/projected/cd521dc6-4126-4c51-8634-66db8ba1412e-kube-api-access-f7flr\") pod \"nova-cell0-cell-mapping-qjkwv\" (UID: \"cd521dc6-4126-4c51-8634-66db8ba1412e\") " pod="openstack/nova-cell0-cell-mapping-qjkwv" Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.929703 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.946219 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qjkwv" Feb 27 00:29:09 crc kubenswrapper[4781]: I0227 00:29:09.948094 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.025762 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.028817 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.031348 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.062833 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7gxp\" (UniqueName: \"kubernetes.io/projected/902efa6b-d07e-4589-b6e6-8016dfdbcd57-kube-api-access-x7gxp\") pod \"nova-api-0\" (UID: \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\") " pod="openstack/nova-api-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.063194 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/809e4ffe-9885-43b8-bb34-b748437f1bb9-config-data\") pod \"nova-metadata-0\" (UID: \"809e4ffe-9885-43b8-bb34-b748437f1bb9\") " pod="openstack/nova-metadata-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.063222 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/809e4ffe-9885-43b8-bb34-b748437f1bb9-logs\") pod \"nova-metadata-0\" (UID: \"809e4ffe-9885-43b8-bb34-b748437f1bb9\") " pod="openstack/nova-metadata-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.063247 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/902efa6b-d07e-4589-b6e6-8016dfdbcd57-config-data\") pod \"nova-api-0\" (UID: \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\") " pod="openstack/nova-api-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.063293 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrbkq\" (UniqueName: 
\"kubernetes.io/projected/809e4ffe-9885-43b8-bb34-b748437f1bb9-kube-api-access-qrbkq\") pod \"nova-metadata-0\" (UID: \"809e4ffe-9885-43b8-bb34-b748437f1bb9\") " pod="openstack/nova-metadata-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.063320 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/809e4ffe-9885-43b8-bb34-b748437f1bb9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"809e4ffe-9885-43b8-bb34-b748437f1bb9\") " pod="openstack/nova-metadata-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.063390 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/902efa6b-d07e-4589-b6e6-8016dfdbcd57-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\") " pod="openstack/nova-api-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.063454 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/902efa6b-d07e-4589-b6e6-8016dfdbcd57-logs\") pod \"nova-api-0\" (UID: \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\") " pod="openstack/nova-api-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.166200 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.168601 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/902efa6b-d07e-4589-b6e6-8016dfdbcd57-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\") " pod="openstack/nova-api-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.168676 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/de057148-8197-4717-bbcc-636e6d64344a-config-data\") pod \"nova-scheduler-0\" (UID: \"de057148-8197-4717-bbcc-636e6d64344a\") " pod="openstack/nova-scheduler-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.168779 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/902efa6b-d07e-4589-b6e6-8016dfdbcd57-logs\") pod \"nova-api-0\" (UID: \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\") " pod="openstack/nova-api-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.168988 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7gxp\" (UniqueName: \"kubernetes.io/projected/902efa6b-d07e-4589-b6e6-8016dfdbcd57-kube-api-access-x7gxp\") pod \"nova-api-0\" (UID: \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\") " pod="openstack/nova-api-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.169017 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de057148-8197-4717-bbcc-636e6d64344a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"de057148-8197-4717-bbcc-636e6d64344a\") " pod="openstack/nova-scheduler-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.169070 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/809e4ffe-9885-43b8-bb34-b748437f1bb9-config-data\") pod \"nova-metadata-0\" (UID: \"809e4ffe-9885-43b8-bb34-b748437f1bb9\") " pod="openstack/nova-metadata-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.169091 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ccw8\" (UniqueName: \"kubernetes.io/projected/de057148-8197-4717-bbcc-636e6d64344a-kube-api-access-9ccw8\") pod \"nova-scheduler-0\" (UID: 
\"de057148-8197-4717-bbcc-636e6d64344a\") " pod="openstack/nova-scheduler-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.169130 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/809e4ffe-9885-43b8-bb34-b748437f1bb9-logs\") pod \"nova-metadata-0\" (UID: \"809e4ffe-9885-43b8-bb34-b748437f1bb9\") " pod="openstack/nova-metadata-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.169151 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/902efa6b-d07e-4589-b6e6-8016dfdbcd57-config-data\") pod \"nova-api-0\" (UID: \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\") " pod="openstack/nova-api-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.169212 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrbkq\" (UniqueName: \"kubernetes.io/projected/809e4ffe-9885-43b8-bb34-b748437f1bb9-kube-api-access-qrbkq\") pod \"nova-metadata-0\" (UID: \"809e4ffe-9885-43b8-bb34-b748437f1bb9\") " pod="openstack/nova-metadata-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.169239 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/809e4ffe-9885-43b8-bb34-b748437f1bb9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"809e4ffe-9885-43b8-bb34-b748437f1bb9\") " pod="openstack/nova-metadata-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.169992 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/902efa6b-d07e-4589-b6e6-8016dfdbcd57-logs\") pod \"nova-api-0\" (UID: \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\") " pod="openstack/nova-api-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.170415 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/809e4ffe-9885-43b8-bb34-b748437f1bb9-logs\") pod \"nova-metadata-0\" (UID: \"809e4ffe-9885-43b8-bb34-b748437f1bb9\") " pod="openstack/nova-metadata-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.176073 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/809e4ffe-9885-43b8-bb34-b748437f1bb9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"809e4ffe-9885-43b8-bb34-b748437f1bb9\") " pod="openstack/nova-metadata-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.183938 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/809e4ffe-9885-43b8-bb34-b748437f1bb9-config-data\") pod \"nova-metadata-0\" (UID: \"809e4ffe-9885-43b8-bb34-b748437f1bb9\") " pod="openstack/nova-metadata-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.189350 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/902efa6b-d07e-4589-b6e6-8016dfdbcd57-config-data\") pod \"nova-api-0\" (UID: \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\") " pod="openstack/nova-api-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.203560 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrbkq\" (UniqueName: \"kubernetes.io/projected/809e4ffe-9885-43b8-bb34-b748437f1bb9-kube-api-access-qrbkq\") pod \"nova-metadata-0\" (UID: \"809e4ffe-9885-43b8-bb34-b748437f1bb9\") " pod="openstack/nova-metadata-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.208061 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/902efa6b-d07e-4589-b6e6-8016dfdbcd57-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\") " pod="openstack/nova-api-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 
00:29:10.210347 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.210646 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7gxp\" (UniqueName: \"kubernetes.io/projected/902efa6b-d07e-4589-b6e6-8016dfdbcd57-kube-api-access-x7gxp\") pod \"nova-api-0\" (UID: \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\") " pod="openstack/nova-api-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.211843 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.216222 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.251000 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.270848 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de057148-8197-4717-bbcc-636e6d64344a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"de057148-8197-4717-bbcc-636e6d64344a\") " pod="openstack/nova-scheduler-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.270903 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ccw8\" (UniqueName: \"kubernetes.io/projected/de057148-8197-4717-bbcc-636e6d64344a-kube-api-access-9ccw8\") pod \"nova-scheduler-0\" (UID: \"de057148-8197-4717-bbcc-636e6d64344a\") " pod="openstack/nova-scheduler-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.271005 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de057148-8197-4717-bbcc-636e6d64344a-config-data\") pod 
\"nova-scheduler-0\" (UID: \"de057148-8197-4717-bbcc-636e6d64344a\") " pod="openstack/nova-scheduler-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.276304 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de057148-8197-4717-bbcc-636e6d64344a-config-data\") pod \"nova-scheduler-0\" (UID: \"de057148-8197-4717-bbcc-636e6d64344a\") " pod="openstack/nova-scheduler-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.278096 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78cd565959-l4cw7"] Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.279923 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-l4cw7" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.280192 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de057148-8197-4717-bbcc-636e6d64344a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"de057148-8197-4717-bbcc-636e6d64344a\") " pod="openstack/nova-scheduler-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.299832 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ccw8\" (UniqueName: \"kubernetes.io/projected/de057148-8197-4717-bbcc-636e6d64344a-kube-api-access-9ccw8\") pod \"nova-scheduler-0\" (UID: \"de057148-8197-4717-bbcc-636e6d64344a\") " pod="openstack/nova-scheduler-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.320151 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-l4cw7"] Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.373729 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-config\") pod \"dnsmasq-dns-78cd565959-l4cw7\" (UID: 
\"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " pod="openstack/dnsmasq-dns-78cd565959-l4cw7" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.373871 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9vk6\" (UniqueName: \"kubernetes.io/projected/5f47f2d5-f4d5-448d-9355-ebe37959b584-kube-api-access-f9vk6\") pod \"dnsmasq-dns-78cd565959-l4cw7\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " pod="openstack/dnsmasq-dns-78cd565959-l4cw7" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.373904 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d780ba2-9829-430e-9a56-0b5b052bfbb7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8d780ba2-9829-430e-9a56-0b5b052bfbb7\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.373942 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxdkk\" (UniqueName: \"kubernetes.io/projected/8d780ba2-9829-430e-9a56-0b5b052bfbb7-kube-api-access-bxdkk\") pod \"nova-cell1-novncproxy-0\" (UID: \"8d780ba2-9829-430e-9a56-0b5b052bfbb7\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.373967 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d780ba2-9829-430e-9a56-0b5b052bfbb7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8d780ba2-9829-430e-9a56-0b5b052bfbb7\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.374152 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-dns-svc\") pod 
\"dnsmasq-dns-78cd565959-l4cw7\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " pod="openstack/dnsmasq-dns-78cd565959-l4cw7" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.374177 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-l4cw7\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " pod="openstack/dnsmasq-dns-78cd565959-l4cw7" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.374213 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-l4cw7\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " pod="openstack/dnsmasq-dns-78cd565959-l4cw7" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.374239 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-l4cw7\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " pod="openstack/dnsmasq-dns-78cd565959-l4cw7" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.462427 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.476170 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-dns-svc\") pod \"dnsmasq-dns-78cd565959-l4cw7\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " pod="openstack/dnsmasq-dns-78cd565959-l4cw7" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.476211 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-l4cw7\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " pod="openstack/dnsmasq-dns-78cd565959-l4cw7" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.476234 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-l4cw7\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " pod="openstack/dnsmasq-dns-78cd565959-l4cw7" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.476250 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-l4cw7\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " pod="openstack/dnsmasq-dns-78cd565959-l4cw7" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.476284 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-config\") pod \"dnsmasq-dns-78cd565959-l4cw7\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " pod="openstack/dnsmasq-dns-78cd565959-l4cw7" Feb 27 00:29:10 crc 
kubenswrapper[4781]: I0227 00:29:10.476344 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9vk6\" (UniqueName: \"kubernetes.io/projected/5f47f2d5-f4d5-448d-9355-ebe37959b584-kube-api-access-f9vk6\") pod \"dnsmasq-dns-78cd565959-l4cw7\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " pod="openstack/dnsmasq-dns-78cd565959-l4cw7" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.476364 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d780ba2-9829-430e-9a56-0b5b052bfbb7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8d780ba2-9829-430e-9a56-0b5b052bfbb7\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.476391 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxdkk\" (UniqueName: \"kubernetes.io/projected/8d780ba2-9829-430e-9a56-0b5b052bfbb7-kube-api-access-bxdkk\") pod \"nova-cell1-novncproxy-0\" (UID: \"8d780ba2-9829-430e-9a56-0b5b052bfbb7\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.476407 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d780ba2-9829-430e-9a56-0b5b052bfbb7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8d780ba2-9829-430e-9a56-0b5b052bfbb7\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.478004 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-l4cw7\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " pod="openstack/dnsmasq-dns-78cd565959-l4cw7" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.479239 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-dns-svc\") pod \"dnsmasq-dns-78cd565959-l4cw7\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " pod="openstack/dnsmasq-dns-78cd565959-l4cw7" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.479757 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-l4cw7\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " pod="openstack/dnsmasq-dns-78cd565959-l4cw7" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.481688 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-config\") pod \"dnsmasq-dns-78cd565959-l4cw7\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " pod="openstack/dnsmasq-dns-78cd565959-l4cw7" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.486974 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d780ba2-9829-430e-9a56-0b5b052bfbb7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8d780ba2-9829-430e-9a56-0b5b052bfbb7\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.490311 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-l4cw7\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " pod="openstack/dnsmasq-dns-78cd565959-l4cw7" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.492280 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8d780ba2-9829-430e-9a56-0b5b052bfbb7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8d780ba2-9829-430e-9a56-0b5b052bfbb7\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.504025 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.511693 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9vk6\" (UniqueName: \"kubernetes.io/projected/5f47f2d5-f4d5-448d-9355-ebe37959b584-kube-api-access-f9vk6\") pod \"dnsmasq-dns-78cd565959-l4cw7\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") " pod="openstack/dnsmasq-dns-78cd565959-l4cw7" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.519154 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.525248 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxdkk\" (UniqueName: \"kubernetes.io/projected/8d780ba2-9829-430e-9a56-0b5b052bfbb7-kube-api-access-bxdkk\") pod \"nova-cell1-novncproxy-0\" (UID: \"8d780ba2-9829-430e-9a56-0b5b052bfbb7\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.555948 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.609456 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-l4cw7" Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.652833 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-qjkwv"] Feb 27 00:29:10 crc kubenswrapper[4781]: I0227 00:29:10.832419 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qjkwv" event={"ID":"cd521dc6-4126-4c51-8634-66db8ba1412e","Type":"ContainerStarted","Data":"6b1e78ae032b9557d03ea57a421dc5b2962405bd66d1c8415a0c89f4e9888284"} Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.367374 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tg9k8"] Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.369173 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tg9k8" Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.373769 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.374039 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.380308 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tg9k8"] Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.500796 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b607db2c-2aa3-48f0-9cd8-c5461797431c-scripts\") pod \"nova-cell1-conductor-db-sync-tg9k8\" (UID: \"b607db2c-2aa3-48f0-9cd8-c5461797431c\") " pod="openstack/nova-cell1-conductor-db-sync-tg9k8" Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.500868 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b607db2c-2aa3-48f0-9cd8-c5461797431c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-tg9k8\" (UID: \"b607db2c-2aa3-48f0-9cd8-c5461797431c\") " pod="openstack/nova-cell1-conductor-db-sync-tg9k8" Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.500950 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b607db2c-2aa3-48f0-9cd8-c5461797431c-config-data\") pod \"nova-cell1-conductor-db-sync-tg9k8\" (UID: \"b607db2c-2aa3-48f0-9cd8-c5461797431c\") " pod="openstack/nova-cell1-conductor-db-sync-tg9k8" Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.501036 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7dh2\" (UniqueName: \"kubernetes.io/projected/b607db2c-2aa3-48f0-9cd8-c5461797431c-kube-api-access-f7dh2\") pod \"nova-cell1-conductor-db-sync-tg9k8\" (UID: \"b607db2c-2aa3-48f0-9cd8-c5461797431c\") " pod="openstack/nova-cell1-conductor-db-sync-tg9k8" Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.529554 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.565363 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.603918 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b607db2c-2aa3-48f0-9cd8-c5461797431c-config-data\") pod \"nova-cell1-conductor-db-sync-tg9k8\" (UID: \"b607db2c-2aa3-48f0-9cd8-c5461797431c\") " pod="openstack/nova-cell1-conductor-db-sync-tg9k8" Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.604026 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7dh2\" (UniqueName: 
\"kubernetes.io/projected/b607db2c-2aa3-48f0-9cd8-c5461797431c-kube-api-access-f7dh2\") pod \"nova-cell1-conductor-db-sync-tg9k8\" (UID: \"b607db2c-2aa3-48f0-9cd8-c5461797431c\") " pod="openstack/nova-cell1-conductor-db-sync-tg9k8" Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.604150 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b607db2c-2aa3-48f0-9cd8-c5461797431c-scripts\") pod \"nova-cell1-conductor-db-sync-tg9k8\" (UID: \"b607db2c-2aa3-48f0-9cd8-c5461797431c\") " pod="openstack/nova-cell1-conductor-db-sync-tg9k8" Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.604187 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b607db2c-2aa3-48f0-9cd8-c5461797431c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-tg9k8\" (UID: \"b607db2c-2aa3-48f0-9cd8-c5461797431c\") " pod="openstack/nova-cell1-conductor-db-sync-tg9k8" Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.619979 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b607db2c-2aa3-48f0-9cd8-c5461797431c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-tg9k8\" (UID: \"b607db2c-2aa3-48f0-9cd8-c5461797431c\") " pod="openstack/nova-cell1-conductor-db-sync-tg9k8" Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.637287 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b607db2c-2aa3-48f0-9cd8-c5461797431c-scripts\") pod \"nova-cell1-conductor-db-sync-tg9k8\" (UID: \"b607db2c-2aa3-48f0-9cd8-c5461797431c\") " pod="openstack/nova-cell1-conductor-db-sync-tg9k8" Feb 27 00:29:11 crc kubenswrapper[4781]: W0227 00:29:11.637394 4781 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d780ba2_9829_430e_9a56_0b5b052bfbb7.slice/crio-c962c4b95a57d2e6b554d146c58ed360df05b5f45a673155452c828cacdad50b WatchSource:0}: Error finding container c962c4b95a57d2e6b554d146c58ed360df05b5f45a673155452c828cacdad50b: Status 404 returned error can't find the container with id c962c4b95a57d2e6b554d146c58ed360df05b5f45a673155452c828cacdad50b Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.638269 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b607db2c-2aa3-48f0-9cd8-c5461797431c-config-data\") pod \"nova-cell1-conductor-db-sync-tg9k8\" (UID: \"b607db2c-2aa3-48f0-9cd8-c5461797431c\") " pod="openstack/nova-cell1-conductor-db-sync-tg9k8" Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.638839 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7dh2\" (UniqueName: \"kubernetes.io/projected/b607db2c-2aa3-48f0-9cd8-c5461797431c-kube-api-access-f7dh2\") pod \"nova-cell1-conductor-db-sync-tg9k8\" (UID: \"b607db2c-2aa3-48f0-9cd8-c5461797431c\") " pod="openstack/nova-cell1-conductor-db-sync-tg9k8" Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.639721 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.656188 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.709197 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tg9k8" Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.874327 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8d780ba2-9829-430e-9a56-0b5b052bfbb7","Type":"ContainerStarted","Data":"c962c4b95a57d2e6b554d146c58ed360df05b5f45a673155452c828cacdad50b"} Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.879260 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qjkwv" event={"ID":"cd521dc6-4126-4c51-8634-66db8ba1412e","Type":"ContainerStarted","Data":"c9388f02af5b31dc8f5e8ea62ee66fb19cbab695e94e5d03ed46c036e292ce69"} Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.893812 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"809e4ffe-9885-43b8-bb34-b748437f1bb9","Type":"ContainerStarted","Data":"3a3cfa9569cf1e101c985b875f586bf5df5c1e9c190016bf01cb0461f1a4b9c8"} Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.902578 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"902efa6b-d07e-4589-b6e6-8016dfdbcd57","Type":"ContainerStarted","Data":"10dbcf9aa331b09eb162dae4f7eb67ae5890ce7956c09aaa8725da5e211a8996"} Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.904153 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"de057148-8197-4717-bbcc-636e6d64344a","Type":"ContainerStarted","Data":"5ecdf1c41abef4437c80f6d85c04db80a9d6858579c757ef6823795e81d59b23"} Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.912533 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-qjkwv" podStartSLOduration=2.912511527 podStartE2EDuration="2.912511527s" podCreationTimestamp="2026-02-27 00:29:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-02-27 00:29:11.902702703 +0000 UTC m=+1421.160242257" watchObservedRunningTime="2026-02-27 00:29:11.912511527 +0000 UTC m=+1421.170051091" Feb 27 00:29:11 crc kubenswrapper[4781]: I0227 00:29:11.988900 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-l4cw7"] Feb 27 00:29:12 crc kubenswrapper[4781]: W0227 00:29:12.288054 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb607db2c_2aa3_48f0_9cd8_c5461797431c.slice/crio-34decb527184bd4ff609a070de28e4da18ec18094d195f97919d2454806f58d8 WatchSource:0}: Error finding container 34decb527184bd4ff609a070de28e4da18ec18094d195f97919d2454806f58d8: Status 404 returned error can't find the container with id 34decb527184bd4ff609a070de28e4da18ec18094d195f97919d2454806f58d8 Feb 27 00:29:12 crc kubenswrapper[4781]: I0227 00:29:12.290013 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tg9k8"] Feb 27 00:29:12 crc kubenswrapper[4781]: I0227 00:29:12.927336 4781 generic.go:334] "Generic (PLEG): container finished" podID="5f47f2d5-f4d5-448d-9355-ebe37959b584" containerID="363437972dc1edd0a85fa61204497c017a7b8e034221df5e68a301f8138ef7f7" exitCode=0 Feb 27 00:29:12 crc kubenswrapper[4781]: I0227 00:29:12.927685 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-l4cw7" event={"ID":"5f47f2d5-f4d5-448d-9355-ebe37959b584","Type":"ContainerDied","Data":"363437972dc1edd0a85fa61204497c017a7b8e034221df5e68a301f8138ef7f7"} Feb 27 00:29:12 crc kubenswrapper[4781]: I0227 00:29:12.927747 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-l4cw7" event={"ID":"5f47f2d5-f4d5-448d-9355-ebe37959b584","Type":"ContainerStarted","Data":"e0e61b6d097a768cedf938a2051e02fe6b26d59774f1dfea50ad4f92d0779d0a"} Feb 27 00:29:12 crc kubenswrapper[4781]: I0227 00:29:12.938276 
4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tg9k8" event={"ID":"b607db2c-2aa3-48f0-9cd8-c5461797431c","Type":"ContainerStarted","Data":"39276ac01bb5ee770105ba2bf75f8d61d8081e22c89cdaa97c9f7ed7f2722110"} Feb 27 00:29:12 crc kubenswrapper[4781]: I0227 00:29:12.938321 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tg9k8" event={"ID":"b607db2c-2aa3-48f0-9cd8-c5461797431c","Type":"ContainerStarted","Data":"34decb527184bd4ff609a070de28e4da18ec18094d195f97919d2454806f58d8"} Feb 27 00:29:12 crc kubenswrapper[4781]: I0227 00:29:12.996049 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-tg9k8" podStartSLOduration=1.9960263889999998 podStartE2EDuration="1.996026389s" podCreationTimestamp="2026-02-27 00:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:29:12.97495893 +0000 UTC m=+1422.232498484" watchObservedRunningTime="2026-02-27 00:29:12.996026389 +0000 UTC m=+1422.253565943" Feb 27 00:29:13 crc kubenswrapper[4781]: I0227 00:29:13.634570 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 00:29:13 crc kubenswrapper[4781]: I0227 00:29:13.656331 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 27 00:29:13 crc kubenswrapper[4781]: E0227 00:29:13.714533 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c0d1328_b565_4c9e_a9dc_e7b863568260.slice/crio-conmon-cfcdb38663d80d12b7e86a05dfe2ce7cc23ff17e6af4e336ba2f0e4a180806c3.scope\": RecentStats: unable to find data in memory cache]" Feb 27 00:29:13 crc kubenswrapper[4781]: I0227 00:29:13.963054 4781 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-78cd565959-l4cw7" event={"ID":"5f47f2d5-f4d5-448d-9355-ebe37959b584","Type":"ContainerStarted","Data":"26d79208d95dcfd480e6dcf5e635ea74d70976218b9d0db2771a4aca513d9249"} Feb 27 00:29:13 crc kubenswrapper[4781]: I0227 00:29:13.963690 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78cd565959-l4cw7" Feb 27 00:29:13 crc kubenswrapper[4781]: I0227 00:29:13.975033 4781 generic.go:334] "Generic (PLEG): container finished" podID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerID="cfcdb38663d80d12b7e86a05dfe2ce7cc23ff17e6af4e336ba2f0e4a180806c3" exitCode=137 Feb 27 00:29:13 crc kubenswrapper[4781]: I0227 00:29:13.975266 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c0d1328-b565-4c9e-a9dc-e7b863568260","Type":"ContainerDied","Data":"cfcdb38663d80d12b7e86a05dfe2ce7cc23ff17e6af4e336ba2f0e4a180806c3"} Feb 27 00:29:13 crc kubenswrapper[4781]: I0227 00:29:13.991370 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78cd565959-l4cw7" podStartSLOduration=3.9913453089999997 podStartE2EDuration="3.991345309s" podCreationTimestamp="2026-02-27 00:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:29:13.980686371 +0000 UTC m=+1423.238225925" watchObservedRunningTime="2026-02-27 00:29:13.991345309 +0000 UTC m=+1423.248884863" Feb 27 00:29:14 crc kubenswrapper[4781]: I0227 00:29:14.990539 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c0d1328-b565-4c9e-a9dc-e7b863568260","Type":"ContainerDied","Data":"72f31190f456d76c84b55f85362321bc4ca382df7c3a1c86e9e23616be0d7246"} Feb 27 00:29:14 crc kubenswrapper[4781]: I0227 00:29:14.990600 4781 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="72f31190f456d76c84b55f85362321bc4ca382df7c3a1c86e9e23616be0d7246" Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.087721 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.220823 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-config-data\") pod \"6c0d1328-b565-4c9e-a9dc-e7b863568260\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.220995 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c0d1328-b565-4c9e-a9dc-e7b863568260-log-httpd\") pod \"6c0d1328-b565-4c9e-a9dc-e7b863568260\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.221100 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c0d1328-b565-4c9e-a9dc-e7b863568260-run-httpd\") pod \"6c0d1328-b565-4c9e-a9dc-e7b863568260\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.221127 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-combined-ca-bundle\") pod \"6c0d1328-b565-4c9e-a9dc-e7b863568260\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.221163 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-scripts\") pod \"6c0d1328-b565-4c9e-a9dc-e7b863568260\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " Feb 27 00:29:15 
crc kubenswrapper[4781]: I0227 00:29:15.221200 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hs5n\" (UniqueName: \"kubernetes.io/projected/6c0d1328-b565-4c9e-a9dc-e7b863568260-kube-api-access-2hs5n\") pod \"6c0d1328-b565-4c9e-a9dc-e7b863568260\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.221226 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-sg-core-conf-yaml\") pod \"6c0d1328-b565-4c9e-a9dc-e7b863568260\" (UID: \"6c0d1328-b565-4c9e-a9dc-e7b863568260\") " Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.221585 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c0d1328-b565-4c9e-a9dc-e7b863568260-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6c0d1328-b565-4c9e-a9dc-e7b863568260" (UID: "6c0d1328-b565-4c9e-a9dc-e7b863568260"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.222190 4781 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c0d1328-b565-4c9e-a9dc-e7b863568260-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.222214 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c0d1328-b565-4c9e-a9dc-e7b863568260-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6c0d1328-b565-4c9e-a9dc-e7b863568260" (UID: "6c0d1328-b565-4c9e-a9dc-e7b863568260"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.227937 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-scripts" (OuterVolumeSpecName: "scripts") pod "6c0d1328-b565-4c9e-a9dc-e7b863568260" (UID: "6c0d1328-b565-4c9e-a9dc-e7b863568260"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.232704 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c0d1328-b565-4c9e-a9dc-e7b863568260-kube-api-access-2hs5n" (OuterVolumeSpecName: "kube-api-access-2hs5n") pod "6c0d1328-b565-4c9e-a9dc-e7b863568260" (UID: "6c0d1328-b565-4c9e-a9dc-e7b863568260"). InnerVolumeSpecName "kube-api-access-2hs5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.274267 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6c0d1328-b565-4c9e-a9dc-e7b863568260" (UID: "6c0d1328-b565-4c9e-a9dc-e7b863568260"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.324422 4781 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c0d1328-b565-4c9e-a9dc-e7b863568260-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.324457 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.324467 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hs5n\" (UniqueName: \"kubernetes.io/projected/6c0d1328-b565-4c9e-a9dc-e7b863568260-kube-api-access-2hs5n\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.324477 4781 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.325445 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c0d1328-b565-4c9e-a9dc-e7b863568260" (UID: "6c0d1328-b565-4c9e-a9dc-e7b863568260"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.365383 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-config-data" (OuterVolumeSpecName: "config-data") pod "6c0d1328-b565-4c9e-a9dc-e7b863568260" (UID: "6c0d1328-b565-4c9e-a9dc-e7b863568260"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.426882 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:15 crc kubenswrapper[4781]: I0227 00:29:15.427236 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c0d1328-b565-4c9e-a9dc-e7b863568260-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.003073 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.042653 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.054108 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.079942 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:29:16 crc kubenswrapper[4781]: E0227 00:29:16.080401 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerName="ceilometer-notification-agent" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.080427 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerName="ceilometer-notification-agent" Feb 27 00:29:16 crc kubenswrapper[4781]: E0227 00:29:16.080446 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerName="proxy-httpd" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.080457 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c0d1328-b565-4c9e-a9dc-e7b863568260" 
containerName="proxy-httpd" Feb 27 00:29:16 crc kubenswrapper[4781]: E0227 00:29:16.080500 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerName="ceilometer-central-agent" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.080511 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerName="ceilometer-central-agent" Feb 27 00:29:16 crc kubenswrapper[4781]: E0227 00:29:16.080520 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerName="sg-core" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.080528 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerName="sg-core" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.080822 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerName="ceilometer-central-agent" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.080849 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerName="sg-core" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.080867 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerName="ceilometer-notification-agent" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.080891 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c0d1328-b565-4c9e-a9dc-e7b863568260" containerName="proxy-httpd" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.083204 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.088015 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.088298 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.089596 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.245811 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.245854 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7825d67f-c124-4ee9-9e74-32c35c4370c0-run-httpd\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.245880 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-scripts\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.246082 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tthpk\" (UniqueName: \"kubernetes.io/projected/7825d67f-c124-4ee9-9e74-32c35c4370c0-kube-api-access-tthpk\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " 
pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.246205 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.246469 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7825d67f-c124-4ee9-9e74-32c35c4370c0-log-httpd\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.246529 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-config-data\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.348565 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7825d67f-c124-4ee9-9e74-32c35c4370c0-log-httpd\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.348641 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-config-data\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.348709 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.348738 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7825d67f-c124-4ee9-9e74-32c35c4370c0-run-httpd\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.348777 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-scripts\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.348881 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tthpk\" (UniqueName: \"kubernetes.io/projected/7825d67f-c124-4ee9-9e74-32c35c4370c0-kube-api-access-tthpk\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.348971 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.349194 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7825d67f-c124-4ee9-9e74-32c35c4370c0-log-httpd\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0" Feb 27 00:29:16 crc 
kubenswrapper[4781]: I0227 00:29:16.349596 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7825d67f-c124-4ee9-9e74-32c35c4370c0-run-httpd\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.354056 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.355233 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.355649 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-scripts\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.359126 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-config-data\") pod \"ceilometer-0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.376772 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tthpk\" (UniqueName: \"kubernetes.io/projected/7825d67f-c124-4ee9-9e74-32c35c4370c0-kube-api-access-tthpk\") pod \"ceilometer-0\" (UID: 
\"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " pod="openstack/ceilometer-0" Feb 27 00:29:16 crc kubenswrapper[4781]: I0227 00:29:16.436752 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:29:17 crc kubenswrapper[4781]: I0227 00:29:17.024118 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:29:17 crc kubenswrapper[4781]: I0227 00:29:17.025863 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"902efa6b-d07e-4589-b6e6-8016dfdbcd57","Type":"ContainerStarted","Data":"df5ce0884ce3591c063c1b53f0f9c2afd193cd3ff9ef524d87062b9d89532b03"} Feb 27 00:29:17 crc kubenswrapper[4781]: I0227 00:29:17.027920 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"de057148-8197-4717-bbcc-636e6d64344a","Type":"ContainerStarted","Data":"d4710a5ddae8764add45c7b1042f91b4a8ffc77d221c51eeb2b6c15be66002de"} Feb 27 00:29:17 crc kubenswrapper[4781]: I0227 00:29:17.040732 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8d780ba2-9829-430e-9a56-0b5b052bfbb7","Type":"ContainerStarted","Data":"7cb922ac2fcfd76994a7254d975044d1fe0a7563db3547acc86bfb78f94c47a2"} Feb 27 00:29:17 crc kubenswrapper[4781]: I0227 00:29:17.040804 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="8d780ba2-9829-430e-9a56-0b5b052bfbb7" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://7cb922ac2fcfd76994a7254d975044d1fe0a7563db3547acc86bfb78f94c47a2" gracePeriod=30 Feb 27 00:29:17 crc kubenswrapper[4781]: I0227 00:29:17.044973 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"809e4ffe-9885-43b8-bb34-b748437f1bb9","Type":"ContainerStarted","Data":"4862dded72f38d6b7705a60af348a527f1e3b310a0f51a4b13c54b456f373399"} Feb 27 00:29:17 crc 
kubenswrapper[4781]: I0227 00:29:17.050884 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.353703748 podStartE2EDuration="8.050865616s" podCreationTimestamp="2026-02-27 00:29:09 +0000 UTC" firstStartedPulling="2026-02-27 00:29:11.649971965 +0000 UTC m=+1420.907511519" lastFinishedPulling="2026-02-27 00:29:16.347133823 +0000 UTC m=+1425.604673387" observedRunningTime="2026-02-27 00:29:17.048930093 +0000 UTC m=+1426.306469647" watchObservedRunningTime="2026-02-27 00:29:17.050865616 +0000 UTC m=+1426.308405170" Feb 27 00:29:17 crc kubenswrapper[4781]: I0227 00:29:17.077935 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.377793597 podStartE2EDuration="8.077913306s" podCreationTimestamp="2026-02-27 00:29:09 +0000 UTC" firstStartedPulling="2026-02-27 00:29:11.649585924 +0000 UTC m=+1420.907125478" lastFinishedPulling="2026-02-27 00:29:16.349705623 +0000 UTC m=+1425.607245187" observedRunningTime="2026-02-27 00:29:17.070891427 +0000 UTC m=+1426.328430981" watchObservedRunningTime="2026-02-27 00:29:17.077913306 +0000 UTC m=+1426.335452860" Feb 27 00:29:17 crc kubenswrapper[4781]: I0227 00:29:17.337170 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c0d1328-b565-4c9e-a9dc-e7b863568260" path="/var/lib/kubelet/pods/6c0d1328-b565-4c9e-a9dc-e7b863568260/volumes" Feb 27 00:29:18 crc kubenswrapper[4781]: I0227 00:29:18.057078 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"902efa6b-d07e-4589-b6e6-8016dfdbcd57","Type":"ContainerStarted","Data":"ea25260e9b2c094b0c8bedd71e82fbdb32ad5e8cdbc2cbf11a8406927316566e"} Feb 27 00:29:18 crc kubenswrapper[4781]: I0227 00:29:18.059389 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"809e4ffe-9885-43b8-bb34-b748437f1bb9","Type":"ContainerStarted","Data":"283d36a4627d5c4e35af3a990333f23576d5448286909a73b669f8cd670a91f6"} Feb 27 00:29:18 crc kubenswrapper[4781]: I0227 00:29:18.059395 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="809e4ffe-9885-43b8-bb34-b748437f1bb9" containerName="nova-metadata-log" containerID="cri-o://4862dded72f38d6b7705a60af348a527f1e3b310a0f51a4b13c54b456f373399" gracePeriod=30 Feb 27 00:29:18 crc kubenswrapper[4781]: I0227 00:29:18.059443 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="809e4ffe-9885-43b8-bb34-b748437f1bb9" containerName="nova-metadata-metadata" containerID="cri-o://283d36a4627d5c4e35af3a990333f23576d5448286909a73b669f8cd670a91f6" gracePeriod=30 Feb 27 00:29:18 crc kubenswrapper[4781]: I0227 00:29:18.061319 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7825d67f-c124-4ee9-9e74-32c35c4370c0","Type":"ContainerStarted","Data":"a90b7ced7061699d62e894c9b3b31c21fe93acf06b438953563f0da53923c22d"} Feb 27 00:29:18 crc kubenswrapper[4781]: I0227 00:29:18.084527 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.07967191 podStartE2EDuration="9.084511021s" podCreationTimestamp="2026-02-27 00:29:09 +0000 UTC" firstStartedPulling="2026-02-27 00:29:11.567221409 +0000 UTC m=+1420.824760963" lastFinishedPulling="2026-02-27 00:29:16.57206052 +0000 UTC m=+1425.829600074" observedRunningTime="2026-02-27 00:29:18.083526474 +0000 UTC m=+1427.341066028" watchObservedRunningTime="2026-02-27 00:29:18.084511021 +0000 UTC m=+1427.342050575" Feb 27 00:29:18 crc kubenswrapper[4781]: I0227 00:29:18.105515 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.092074905 
podStartE2EDuration="9.105493248s" podCreationTimestamp="2026-02-27 00:29:09 +0000 UTC" firstStartedPulling="2026-02-27 00:29:11.523817446 +0000 UTC m=+1420.781357000" lastFinishedPulling="2026-02-27 00:29:16.537235789 +0000 UTC m=+1425.794775343" observedRunningTime="2026-02-27 00:29:18.099321511 +0000 UTC m=+1427.356861065" watchObservedRunningTime="2026-02-27 00:29:18.105493248 +0000 UTC m=+1427.363032802" Feb 27 00:29:18 crc kubenswrapper[4781]: I0227 00:29:18.811364 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 00:29:18 crc kubenswrapper[4781]: I0227 00:29:18.909784 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrbkq\" (UniqueName: \"kubernetes.io/projected/809e4ffe-9885-43b8-bb34-b748437f1bb9-kube-api-access-qrbkq\") pod \"809e4ffe-9885-43b8-bb34-b748437f1bb9\" (UID: \"809e4ffe-9885-43b8-bb34-b748437f1bb9\") " Feb 27 00:29:18 crc kubenswrapper[4781]: I0227 00:29:18.909843 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/809e4ffe-9885-43b8-bb34-b748437f1bb9-combined-ca-bundle\") pod \"809e4ffe-9885-43b8-bb34-b748437f1bb9\" (UID: \"809e4ffe-9885-43b8-bb34-b748437f1bb9\") " Feb 27 00:29:18 crc kubenswrapper[4781]: I0227 00:29:18.910017 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/809e4ffe-9885-43b8-bb34-b748437f1bb9-config-data\") pod \"809e4ffe-9885-43b8-bb34-b748437f1bb9\" (UID: \"809e4ffe-9885-43b8-bb34-b748437f1bb9\") " Feb 27 00:29:18 crc kubenswrapper[4781]: I0227 00:29:18.910098 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/809e4ffe-9885-43b8-bb34-b748437f1bb9-logs\") pod \"809e4ffe-9885-43b8-bb34-b748437f1bb9\" (UID: \"809e4ffe-9885-43b8-bb34-b748437f1bb9\") " Feb 
27 00:29:18 crc kubenswrapper[4781]: I0227 00:29:18.910886 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/809e4ffe-9885-43b8-bb34-b748437f1bb9-logs" (OuterVolumeSpecName: "logs") pod "809e4ffe-9885-43b8-bb34-b748437f1bb9" (UID: "809e4ffe-9885-43b8-bb34-b748437f1bb9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:29:18 crc kubenswrapper[4781]: I0227 00:29:18.919920 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/809e4ffe-9885-43b8-bb34-b748437f1bb9-kube-api-access-qrbkq" (OuterVolumeSpecName: "kube-api-access-qrbkq") pod "809e4ffe-9885-43b8-bb34-b748437f1bb9" (UID: "809e4ffe-9885-43b8-bb34-b748437f1bb9"). InnerVolumeSpecName "kube-api-access-qrbkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:29:18 crc kubenswrapper[4781]: I0227 00:29:18.943045 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/809e4ffe-9885-43b8-bb34-b748437f1bb9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "809e4ffe-9885-43b8-bb34-b748437f1bb9" (UID: "809e4ffe-9885-43b8-bb34-b748437f1bb9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:18 crc kubenswrapper[4781]: I0227 00:29:18.969726 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/809e4ffe-9885-43b8-bb34-b748437f1bb9-config-data" (OuterVolumeSpecName: "config-data") pod "809e4ffe-9885-43b8-bb34-b748437f1bb9" (UID: "809e4ffe-9885-43b8-bb34-b748437f1bb9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.012941 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrbkq\" (UniqueName: \"kubernetes.io/projected/809e4ffe-9885-43b8-bb34-b748437f1bb9-kube-api-access-qrbkq\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.012975 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/809e4ffe-9885-43b8-bb34-b748437f1bb9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.012984 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/809e4ffe-9885-43b8-bb34-b748437f1bb9-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.012994 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/809e4ffe-9885-43b8-bb34-b748437f1bb9-logs\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.098074 4781 generic.go:334] "Generic (PLEG): container finished" podID="809e4ffe-9885-43b8-bb34-b748437f1bb9" containerID="283d36a4627d5c4e35af3a990333f23576d5448286909a73b669f8cd670a91f6" exitCode=0 Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.098487 4781 generic.go:334] "Generic (PLEG): container finished" podID="809e4ffe-9885-43b8-bb34-b748437f1bb9" containerID="4862dded72f38d6b7705a60af348a527f1e3b310a0f51a4b13c54b456f373399" exitCode=143 Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.098318 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"809e4ffe-9885-43b8-bb34-b748437f1bb9","Type":"ContainerDied","Data":"283d36a4627d5c4e35af3a990333f23576d5448286909a73b669f8cd670a91f6"} Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.098583 4781 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"809e4ffe-9885-43b8-bb34-b748437f1bb9","Type":"ContainerDied","Data":"4862dded72f38d6b7705a60af348a527f1e3b310a0f51a4b13c54b456f373399"} Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.098600 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"809e4ffe-9885-43b8-bb34-b748437f1bb9","Type":"ContainerDied","Data":"3a3cfa9569cf1e101c985b875f586bf5df5c1e9c190016bf01cb0461f1a4b9c8"} Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.098620 4781 scope.go:117] "RemoveContainer" containerID="283d36a4627d5c4e35af3a990333f23576d5448286909a73b669f8cd670a91f6" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.098405 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.112148 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7825d67f-c124-4ee9-9e74-32c35c4370c0","Type":"ContainerStarted","Data":"3ddb72adfd8dadbe432eb551c304f261946dae5663273b00c2b5c6ab9ec5b0b1"} Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.154811 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.159846 4781 scope.go:117] "RemoveContainer" containerID="4862dded72f38d6b7705a60af348a527f1e3b310a0f51a4b13c54b456f373399" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.183888 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.185207 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 27 00:29:19 crc kubenswrapper[4781]: E0227 00:29:19.185859 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="809e4ffe-9885-43b8-bb34-b748437f1bb9" 
containerName="nova-metadata-metadata" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.185959 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="809e4ffe-9885-43b8-bb34-b748437f1bb9" containerName="nova-metadata-metadata" Feb 27 00:29:19 crc kubenswrapper[4781]: E0227 00:29:19.186048 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="809e4ffe-9885-43b8-bb34-b748437f1bb9" containerName="nova-metadata-log" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.186107 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="809e4ffe-9885-43b8-bb34-b748437f1bb9" containerName="nova-metadata-log" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.186355 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="809e4ffe-9885-43b8-bb34-b748437f1bb9" containerName="nova-metadata-log" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.186440 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="809e4ffe-9885-43b8-bb34-b748437f1bb9" containerName="nova-metadata-metadata" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.187587 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.192910 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.193167 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.208294 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.210913 4781 scope.go:117] "RemoveContainer" containerID="283d36a4627d5c4e35af3a990333f23576d5448286909a73b669f8cd670a91f6" Feb 27 00:29:19 crc kubenswrapper[4781]: E0227 00:29:19.211338 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"283d36a4627d5c4e35af3a990333f23576d5448286909a73b669f8cd670a91f6\": container with ID starting with 283d36a4627d5c4e35af3a990333f23576d5448286909a73b669f8cd670a91f6 not found: ID does not exist" containerID="283d36a4627d5c4e35af3a990333f23576d5448286909a73b669f8cd670a91f6" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.211479 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"283d36a4627d5c4e35af3a990333f23576d5448286909a73b669f8cd670a91f6"} err="failed to get container status \"283d36a4627d5c4e35af3a990333f23576d5448286909a73b669f8cd670a91f6\": rpc error: code = NotFound desc = could not find container \"283d36a4627d5c4e35af3a990333f23576d5448286909a73b669f8cd670a91f6\": container with ID starting with 283d36a4627d5c4e35af3a990333f23576d5448286909a73b669f8cd670a91f6 not found: ID does not exist" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.211584 4781 scope.go:117] "RemoveContainer" containerID="4862dded72f38d6b7705a60af348a527f1e3b310a0f51a4b13c54b456f373399" Feb 27 00:29:19 crc 
kubenswrapper[4781]: E0227 00:29:19.211897 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4862dded72f38d6b7705a60af348a527f1e3b310a0f51a4b13c54b456f373399\": container with ID starting with 4862dded72f38d6b7705a60af348a527f1e3b310a0f51a4b13c54b456f373399 not found: ID does not exist" containerID="4862dded72f38d6b7705a60af348a527f1e3b310a0f51a4b13c54b456f373399" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.211988 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4862dded72f38d6b7705a60af348a527f1e3b310a0f51a4b13c54b456f373399"} err="failed to get container status \"4862dded72f38d6b7705a60af348a527f1e3b310a0f51a4b13c54b456f373399\": rpc error: code = NotFound desc = could not find container \"4862dded72f38d6b7705a60af348a527f1e3b310a0f51a4b13c54b456f373399\": container with ID starting with 4862dded72f38d6b7705a60af348a527f1e3b310a0f51a4b13c54b456f373399 not found: ID does not exist" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.212058 4781 scope.go:117] "RemoveContainer" containerID="283d36a4627d5c4e35af3a990333f23576d5448286909a73b669f8cd670a91f6" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.212288 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"283d36a4627d5c4e35af3a990333f23576d5448286909a73b669f8cd670a91f6"} err="failed to get container status \"283d36a4627d5c4e35af3a990333f23576d5448286909a73b669f8cd670a91f6\": rpc error: code = NotFound desc = could not find container \"283d36a4627d5c4e35af3a990333f23576d5448286909a73b669f8cd670a91f6\": container with ID starting with 283d36a4627d5c4e35af3a990333f23576d5448286909a73b669f8cd670a91f6 not found: ID does not exist" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.212378 4781 scope.go:117] "RemoveContainer" containerID="4862dded72f38d6b7705a60af348a527f1e3b310a0f51a4b13c54b456f373399" Feb 27 
00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.212647 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4862dded72f38d6b7705a60af348a527f1e3b310a0f51a4b13c54b456f373399"} err="failed to get container status \"4862dded72f38d6b7705a60af348a527f1e3b310a0f51a4b13c54b456f373399\": rpc error: code = NotFound desc = could not find container \"4862dded72f38d6b7705a60af348a527f1e3b310a0f51a4b13c54b456f373399\": container with ID starting with 4862dded72f38d6b7705a60af348a527f1e3b310a0f51a4b13c54b456f373399 not found: ID does not exist" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.319395 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ed68203-3ac6-4133-92d9-175f234d5229-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " pod="openstack/nova-metadata-0" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.319459 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ed68203-3ac6-4133-92d9-175f234d5229-logs\") pod \"nova-metadata-0\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " pod="openstack/nova-metadata-0" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.319529 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ed68203-3ac6-4133-92d9-175f234d5229-config-data\") pod \"nova-metadata-0\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " pod="openstack/nova-metadata-0" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.319580 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c25tq\" (UniqueName: \"kubernetes.io/projected/5ed68203-3ac6-4133-92d9-175f234d5229-kube-api-access-c25tq\") 
pod \"nova-metadata-0\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " pod="openstack/nova-metadata-0" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.319601 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ed68203-3ac6-4133-92d9-175f234d5229-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " pod="openstack/nova-metadata-0" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.327820 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="809e4ffe-9885-43b8-bb34-b748437f1bb9" path="/var/lib/kubelet/pods/809e4ffe-9885-43b8-bb34-b748437f1bb9/volumes" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.421482 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c25tq\" (UniqueName: \"kubernetes.io/projected/5ed68203-3ac6-4133-92d9-175f234d5229-kube-api-access-c25tq\") pod \"nova-metadata-0\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " pod="openstack/nova-metadata-0" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.421831 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ed68203-3ac6-4133-92d9-175f234d5229-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " pod="openstack/nova-metadata-0" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.421953 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ed68203-3ac6-4133-92d9-175f234d5229-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " pod="openstack/nova-metadata-0" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.422005 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ed68203-3ac6-4133-92d9-175f234d5229-logs\") pod \"nova-metadata-0\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " pod="openstack/nova-metadata-0" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.422082 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ed68203-3ac6-4133-92d9-175f234d5229-config-data\") pod \"nova-metadata-0\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " pod="openstack/nova-metadata-0" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.422911 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ed68203-3ac6-4133-92d9-175f234d5229-logs\") pod \"nova-metadata-0\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " pod="openstack/nova-metadata-0" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.425901 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ed68203-3ac6-4133-92d9-175f234d5229-config-data\") pod \"nova-metadata-0\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " pod="openstack/nova-metadata-0" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.426486 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ed68203-3ac6-4133-92d9-175f234d5229-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " pod="openstack/nova-metadata-0" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.434711 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ed68203-3ac6-4133-92d9-175f234d5229-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " 
pod="openstack/nova-metadata-0" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.437722 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c25tq\" (UniqueName: \"kubernetes.io/projected/5ed68203-3ac6-4133-92d9-175f234d5229-kube-api-access-c25tq\") pod \"nova-metadata-0\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " pod="openstack/nova-metadata-0" Feb 27 00:29:19 crc kubenswrapper[4781]: I0227 00:29:19.526153 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 00:29:20 crc kubenswrapper[4781]: I0227 00:29:20.040942 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 00:29:20 crc kubenswrapper[4781]: W0227 00:29:20.043061 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ed68203_3ac6_4133_92d9_175f234d5229.slice/crio-1402200c1e03e61992b31653885e83e4b38fc47073559c639e90d417e5d65cb7 WatchSource:0}: Error finding container 1402200c1e03e61992b31653885e83e4b38fc47073559c639e90d417e5d65cb7: Status 404 returned error can't find the container with id 1402200c1e03e61992b31653885e83e4b38fc47073559c639e90d417e5d65cb7 Feb 27 00:29:20 crc kubenswrapper[4781]: I0227 00:29:20.133302 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5ed68203-3ac6-4133-92d9-175f234d5229","Type":"ContainerStarted","Data":"1402200c1e03e61992b31653885e83e4b38fc47073559c639e90d417e5d65cb7"} Feb 27 00:29:20 crc kubenswrapper[4781]: I0227 00:29:20.135923 4781 generic.go:334] "Generic (PLEG): container finished" podID="cd521dc6-4126-4c51-8634-66db8ba1412e" containerID="c9388f02af5b31dc8f5e8ea62ee66fb19cbab695e94e5d03ed46c036e292ce69" exitCode=0 Feb 27 00:29:20 crc kubenswrapper[4781]: I0227 00:29:20.135974 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qjkwv" 
event={"ID":"cd521dc6-4126-4c51-8634-66db8ba1412e","Type":"ContainerDied","Data":"c9388f02af5b31dc8f5e8ea62ee66fb19cbab695e94e5d03ed46c036e292ce69"} Feb 27 00:29:20 crc kubenswrapper[4781]: I0227 00:29:20.147275 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7825d67f-c124-4ee9-9e74-32c35c4370c0","Type":"ContainerStarted","Data":"fd9909df11f574e0138a430f34c72bf18ace2c57464e54425e45df0b7fd14f75"} Feb 27 00:29:20 crc kubenswrapper[4781]: I0227 00:29:20.147315 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7825d67f-c124-4ee9-9e74-32c35c4370c0","Type":"ContainerStarted","Data":"9a609615b1e77c141503575f4b85bd73b7b9605cdd075e949757163fb3230f19"} Feb 27 00:29:20 crc kubenswrapper[4781]: I0227 00:29:20.464081 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 27 00:29:20 crc kubenswrapper[4781]: I0227 00:29:20.464132 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 27 00:29:20 crc kubenswrapper[4781]: I0227 00:29:20.521542 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 27 00:29:20 crc kubenswrapper[4781]: I0227 00:29:20.521596 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 27 00:29:20 crc kubenswrapper[4781]: I0227 00:29:20.556724 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 27 00:29:20 crc kubenswrapper[4781]: I0227 00:29:20.561069 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 27 00:29:20 crc kubenswrapper[4781]: I0227 00:29:20.618343 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78cd565959-l4cw7" Feb 27 00:29:20 crc kubenswrapper[4781]: I0227 
00:29:20.715685 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-5mf9t"] Feb 27 00:29:20 crc kubenswrapper[4781]: I0227 00:29:20.715954 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" podUID="39b2afc0-76d7-48e9-8528-f88e3ba22955" containerName="dnsmasq-dns" containerID="cri-o://bb8c0d69bd70d80999cf07d7e8306d44a8648ef91de2762edd1a659e5f8fb1d6" gracePeriod=10 Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.164901 4781 generic.go:334] "Generic (PLEG): container finished" podID="39b2afc0-76d7-48e9-8528-f88e3ba22955" containerID="bb8c0d69bd70d80999cf07d7e8306d44a8648ef91de2762edd1a659e5f8fb1d6" exitCode=0 Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.164988 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" event={"ID":"39b2afc0-76d7-48e9-8528-f88e3ba22955","Type":"ContainerDied","Data":"bb8c0d69bd70d80999cf07d7e8306d44a8648ef91de2762edd1a659e5f8fb1d6"} Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.173690 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5ed68203-3ac6-4133-92d9-175f234d5229","Type":"ContainerStarted","Data":"be2fe215086cd4058aea52c301ed09e04ac3143d7e54d38772b785701e47e5f8"} Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.174049 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5ed68203-3ac6-4133-92d9-175f234d5229","Type":"ContainerStarted","Data":"bfa97c01ece2e8cbadd8eda7e12994d67d495e411ba60ed25dc9b412019a8f03"} Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.202136 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.202098776 podStartE2EDuration="2.202098776s" podCreationTimestamp="2026-02-27 00:29:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:29:21.197860642 +0000 UTC m=+1430.455400216" watchObservedRunningTime="2026-02-27 00:29:21.202098776 +0000 UTC m=+1430.459638330" Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.253153 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.431682 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.471907 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-ovsdbserver-nb\") pod \"39b2afc0-76d7-48e9-8528-f88e3ba22955\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.471981 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-dns-svc\") pod \"39b2afc0-76d7-48e9-8528-f88e3ba22955\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.472131 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-ovsdbserver-sb\") pod \"39b2afc0-76d7-48e9-8528-f88e3ba22955\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.472186 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-config\") pod \"39b2afc0-76d7-48e9-8528-f88e3ba22955\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " Feb 27 00:29:21 crc 
kubenswrapper[4781]: I0227 00:29:21.472227 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-dns-swift-storage-0\") pod \"39b2afc0-76d7-48e9-8528-f88e3ba22955\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.472350 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4nk7\" (UniqueName: \"kubernetes.io/projected/39b2afc0-76d7-48e9-8528-f88e3ba22955-kube-api-access-w4nk7\") pod \"39b2afc0-76d7-48e9-8528-f88e3ba22955\" (UID: \"39b2afc0-76d7-48e9-8528-f88e3ba22955\") " Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.481936 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39b2afc0-76d7-48e9-8528-f88e3ba22955-kube-api-access-w4nk7" (OuterVolumeSpecName: "kube-api-access-w4nk7") pod "39b2afc0-76d7-48e9-8528-f88e3ba22955" (UID: "39b2afc0-76d7-48e9-8528-f88e3ba22955"). InnerVolumeSpecName "kube-api-access-w4nk7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.549305 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="902efa6b-d07e-4589-b6e6-8016dfdbcd57" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.217:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.549433 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="902efa6b-d07e-4589-b6e6-8016dfdbcd57" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.217:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.575317 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4nk7\" (UniqueName: \"kubernetes.io/projected/39b2afc0-76d7-48e9-8528-f88e3ba22955-kube-api-access-w4nk7\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.579320 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "39b2afc0-76d7-48e9-8528-f88e3ba22955" (UID: "39b2afc0-76d7-48e9-8528-f88e3ba22955"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.591463 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "39b2afc0-76d7-48e9-8528-f88e3ba22955" (UID: "39b2afc0-76d7-48e9-8528-f88e3ba22955"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.606149 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-config" (OuterVolumeSpecName: "config") pod "39b2afc0-76d7-48e9-8528-f88e3ba22955" (UID: "39b2afc0-76d7-48e9-8528-f88e3ba22955"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.621500 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "39b2afc0-76d7-48e9-8528-f88e3ba22955" (UID: "39b2afc0-76d7-48e9-8528-f88e3ba22955"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.623398 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "39b2afc0-76d7-48e9-8528-f88e3ba22955" (UID: "39b2afc0-76d7-48e9-8528-f88e3ba22955"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.679895 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.679928 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.679938 4781 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.679948 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.679977 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39b2afc0-76d7-48e9-8528-f88e3ba22955-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.810239 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qjkwv" Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.883296 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7flr\" (UniqueName: \"kubernetes.io/projected/cd521dc6-4126-4c51-8634-66db8ba1412e-kube-api-access-f7flr\") pod \"cd521dc6-4126-4c51-8634-66db8ba1412e\" (UID: \"cd521dc6-4126-4c51-8634-66db8ba1412e\") " Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.883382 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd521dc6-4126-4c51-8634-66db8ba1412e-combined-ca-bundle\") pod \"cd521dc6-4126-4c51-8634-66db8ba1412e\" (UID: \"cd521dc6-4126-4c51-8634-66db8ba1412e\") " Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.883477 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd521dc6-4126-4c51-8634-66db8ba1412e-scripts\") pod \"cd521dc6-4126-4c51-8634-66db8ba1412e\" (UID: \"cd521dc6-4126-4c51-8634-66db8ba1412e\") " Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.883580 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd521dc6-4126-4c51-8634-66db8ba1412e-config-data\") pod \"cd521dc6-4126-4c51-8634-66db8ba1412e\" (UID: \"cd521dc6-4126-4c51-8634-66db8ba1412e\") " Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.887641 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd521dc6-4126-4c51-8634-66db8ba1412e-kube-api-access-f7flr" (OuterVolumeSpecName: "kube-api-access-f7flr") pod "cd521dc6-4126-4c51-8634-66db8ba1412e" (UID: "cd521dc6-4126-4c51-8634-66db8ba1412e"). InnerVolumeSpecName "kube-api-access-f7flr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.899267 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd521dc6-4126-4c51-8634-66db8ba1412e-scripts" (OuterVolumeSpecName: "scripts") pod "cd521dc6-4126-4c51-8634-66db8ba1412e" (UID: "cd521dc6-4126-4c51-8634-66db8ba1412e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.920328 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd521dc6-4126-4c51-8634-66db8ba1412e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd521dc6-4126-4c51-8634-66db8ba1412e" (UID: "cd521dc6-4126-4c51-8634-66db8ba1412e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.942834 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd521dc6-4126-4c51-8634-66db8ba1412e-config-data" (OuterVolumeSpecName: "config-data") pod "cd521dc6-4126-4c51-8634-66db8ba1412e" (UID: "cd521dc6-4126-4c51-8634-66db8ba1412e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.986284 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd521dc6-4126-4c51-8634-66db8ba1412e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.986315 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd521dc6-4126-4c51-8634-66db8ba1412e-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.986324 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd521dc6-4126-4c51-8634-66db8ba1412e-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:21 crc kubenswrapper[4781]: I0227 00:29:21.986333 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7flr\" (UniqueName: \"kubernetes.io/projected/cd521dc6-4126-4c51-8634-66db8ba1412e-kube-api-access-f7flr\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:22 crc kubenswrapper[4781]: I0227 00:29:22.204734 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qjkwv" event={"ID":"cd521dc6-4126-4c51-8634-66db8ba1412e","Type":"ContainerDied","Data":"6b1e78ae032b9557d03ea57a421dc5b2962405bd66d1c8415a0c89f4e9888284"} Feb 27 00:29:22 crc kubenswrapper[4781]: I0227 00:29:22.204772 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b1e78ae032b9557d03ea57a421dc5b2962405bd66d1c8415a0c89f4e9888284" Feb 27 00:29:22 crc kubenswrapper[4781]: I0227 00:29:22.204836 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qjkwv" Feb 27 00:29:22 crc kubenswrapper[4781]: I0227 00:29:22.227140 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" Feb 27 00:29:22 crc kubenswrapper[4781]: I0227 00:29:22.228140 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-5mf9t" event={"ID":"39b2afc0-76d7-48e9-8528-f88e3ba22955","Type":"ContainerDied","Data":"ede845938dcbb2c0e3303591186eb47bf17d10a92d1b0dd61b8430ff2dd6aa13"} Feb 27 00:29:22 crc kubenswrapper[4781]: I0227 00:29:22.228221 4781 scope.go:117] "RemoveContainer" containerID="bb8c0d69bd70d80999cf07d7e8306d44a8648ef91de2762edd1a659e5f8fb1d6" Feb 27 00:29:22 crc kubenswrapper[4781]: I0227 00:29:22.293735 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 27 00:29:22 crc kubenswrapper[4781]: I0227 00:29:22.294021 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="902efa6b-d07e-4589-b6e6-8016dfdbcd57" containerName="nova-api-log" containerID="cri-o://df5ce0884ce3591c063c1b53f0f9c2afd193cd3ff9ef524d87062b9d89532b03" gracePeriod=30 Feb 27 00:29:22 crc kubenswrapper[4781]: I0227 00:29:22.294274 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="902efa6b-d07e-4589-b6e6-8016dfdbcd57" containerName="nova-api-api" containerID="cri-o://ea25260e9b2c094b0c8bedd71e82fbdb32ad5e8cdbc2cbf11a8406927316566e" gracePeriod=30 Feb 27 00:29:22 crc kubenswrapper[4781]: I0227 00:29:22.306501 4781 scope.go:117] "RemoveContainer" containerID="ba0fa606453c74eda00c418113d9f320bbbe55741c968eedcc82d3ff7571054d" Feb 27 00:29:22 crc kubenswrapper[4781]: I0227 00:29:22.313699 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 00:29:22 crc kubenswrapper[4781]: I0227 00:29:22.343479 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-5mf9t"] Feb 27 00:29:22 crc kubenswrapper[4781]: I0227 00:29:22.354911 4781 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/dnsmasq-dns-67bdc55879-5mf9t"] Feb 27 00:29:22 crc kubenswrapper[4781]: I0227 00:29:22.365697 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 00:29:23 crc kubenswrapper[4781]: I0227 00:29:23.236968 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7825d67f-c124-4ee9-9e74-32c35c4370c0","Type":"ContainerStarted","Data":"78c52f488afaab989176f5c5ab096fb61a4e74a72dc5d52ce83048b14f67d902"} Feb 27 00:29:23 crc kubenswrapper[4781]: I0227 00:29:23.237460 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 27 00:29:23 crc kubenswrapper[4781]: I0227 00:29:23.240356 4781 generic.go:334] "Generic (PLEG): container finished" podID="902efa6b-d07e-4589-b6e6-8016dfdbcd57" containerID="df5ce0884ce3591c063c1b53f0f9c2afd193cd3ff9ef524d87062b9d89532b03" exitCode=143 Feb 27 00:29:23 crc kubenswrapper[4781]: I0227 00:29:23.240518 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="de057148-8197-4717-bbcc-636e6d64344a" containerName="nova-scheduler-scheduler" containerID="cri-o://d4710a5ddae8764add45c7b1042f91b4a8ffc77d221c51eeb2b6c15be66002de" gracePeriod=30 Feb 27 00:29:23 crc kubenswrapper[4781]: I0227 00:29:23.240747 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"902efa6b-d07e-4589-b6e6-8016dfdbcd57","Type":"ContainerDied","Data":"df5ce0884ce3591c063c1b53f0f9c2afd193cd3ff9ef524d87062b9d89532b03"} Feb 27 00:29:23 crc kubenswrapper[4781]: I0227 00:29:23.241179 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5ed68203-3ac6-4133-92d9-175f234d5229" containerName="nova-metadata-metadata" containerID="cri-o://be2fe215086cd4058aea52c301ed09e04ac3143d7e54d38772b785701e47e5f8" gracePeriod=30 Feb 27 00:29:23 crc kubenswrapper[4781]: I0227 
00:29:23.241341 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5ed68203-3ac6-4133-92d9-175f234d5229" containerName="nova-metadata-log" containerID="cri-o://bfa97c01ece2e8cbadd8eda7e12994d67d495e411ba60ed25dc9b412019a8f03" gracePeriod=30 Feb 27 00:29:23 crc kubenswrapper[4781]: I0227 00:29:23.303360 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.3069766400000002 podStartE2EDuration="7.303338343s" podCreationTimestamp="2026-02-27 00:29:16 +0000 UTC" firstStartedPulling="2026-02-27 00:29:17.040825984 +0000 UTC m=+1426.298365538" lastFinishedPulling="2026-02-27 00:29:22.037187687 +0000 UTC m=+1431.294727241" observedRunningTime="2026-02-27 00:29:23.288527283 +0000 UTC m=+1432.546066837" watchObservedRunningTime="2026-02-27 00:29:23.303338343 +0000 UTC m=+1432.560877897" Feb 27 00:29:23 crc kubenswrapper[4781]: I0227 00:29:23.324683 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39b2afc0-76d7-48e9-8528-f88e3ba22955" path="/var/lib/kubelet/pods/39b2afc0-76d7-48e9-8528-f88e3ba22955/volumes" Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.252226 4781 generic.go:334] "Generic (PLEG): container finished" podID="5ed68203-3ac6-4133-92d9-175f234d5229" containerID="be2fe215086cd4058aea52c301ed09e04ac3143d7e54d38772b785701e47e5f8" exitCode=0 Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.252601 4781 generic.go:334] "Generic (PLEG): container finished" podID="5ed68203-3ac6-4133-92d9-175f234d5229" containerID="bfa97c01ece2e8cbadd8eda7e12994d67d495e411ba60ed25dc9b412019a8f03" exitCode=143 Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.254231 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5ed68203-3ac6-4133-92d9-175f234d5229","Type":"ContainerDied","Data":"be2fe215086cd4058aea52c301ed09e04ac3143d7e54d38772b785701e47e5f8"} Feb 27 
00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.254286 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5ed68203-3ac6-4133-92d9-175f234d5229","Type":"ContainerDied","Data":"bfa97c01ece2e8cbadd8eda7e12994d67d495e411ba60ed25dc9b412019a8f03"} Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.254312 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5ed68203-3ac6-4133-92d9-175f234d5229","Type":"ContainerDied","Data":"1402200c1e03e61992b31653885e83e4b38fc47073559c639e90d417e5d65cb7"} Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.254330 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1402200c1e03e61992b31653885e83e4b38fc47073559c639e90d417e5d65cb7" Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.325079 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.437529 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ed68203-3ac6-4133-92d9-175f234d5229-config-data\") pod \"5ed68203-3ac6-4133-92d9-175f234d5229\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.437610 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ed68203-3ac6-4133-92d9-175f234d5229-combined-ca-bundle\") pod \"5ed68203-3ac6-4133-92d9-175f234d5229\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.437727 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c25tq\" (UniqueName: \"kubernetes.io/projected/5ed68203-3ac6-4133-92d9-175f234d5229-kube-api-access-c25tq\") pod 
\"5ed68203-3ac6-4133-92d9-175f234d5229\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.437796 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ed68203-3ac6-4133-92d9-175f234d5229-logs\") pod \"5ed68203-3ac6-4133-92d9-175f234d5229\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.437997 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ed68203-3ac6-4133-92d9-175f234d5229-nova-metadata-tls-certs\") pod \"5ed68203-3ac6-4133-92d9-175f234d5229\" (UID: \"5ed68203-3ac6-4133-92d9-175f234d5229\") " Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.438255 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ed68203-3ac6-4133-92d9-175f234d5229-logs" (OuterVolumeSpecName: "logs") pod "5ed68203-3ac6-4133-92d9-175f234d5229" (UID: "5ed68203-3ac6-4133-92d9-175f234d5229"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.438801 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ed68203-3ac6-4133-92d9-175f234d5229-logs\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.446676 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ed68203-3ac6-4133-92d9-175f234d5229-kube-api-access-c25tq" (OuterVolumeSpecName: "kube-api-access-c25tq") pod "5ed68203-3ac6-4133-92d9-175f234d5229" (UID: "5ed68203-3ac6-4133-92d9-175f234d5229"). InnerVolumeSpecName "kube-api-access-c25tq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.466373 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ed68203-3ac6-4133-92d9-175f234d5229-config-data" (OuterVolumeSpecName: "config-data") pod "5ed68203-3ac6-4133-92d9-175f234d5229" (UID: "5ed68203-3ac6-4133-92d9-175f234d5229"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.485619 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ed68203-3ac6-4133-92d9-175f234d5229-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ed68203-3ac6-4133-92d9-175f234d5229" (UID: "5ed68203-3ac6-4133-92d9-175f234d5229"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.496314 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ed68203-3ac6-4133-92d9-175f234d5229-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "5ed68203-3ac6-4133-92d9-175f234d5229" (UID: "5ed68203-3ac6-4133-92d9-175f234d5229"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.541179 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c25tq\" (UniqueName: \"kubernetes.io/projected/5ed68203-3ac6-4133-92d9-175f234d5229-kube-api-access-c25tq\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.541223 4781 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ed68203-3ac6-4133-92d9-175f234d5229-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.541237 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ed68203-3ac6-4133-92d9-175f234d5229-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:24 crc kubenswrapper[4781]: I0227 00:29:24.541250 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ed68203-3ac6-4133-92d9-175f234d5229-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.185607 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.257167 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de057148-8197-4717-bbcc-636e6d64344a-config-data\") pod \"de057148-8197-4717-bbcc-636e6d64344a\" (UID: \"de057148-8197-4717-bbcc-636e6d64344a\") " Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.257254 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ccw8\" (UniqueName: \"kubernetes.io/projected/de057148-8197-4717-bbcc-636e6d64344a-kube-api-access-9ccw8\") pod \"de057148-8197-4717-bbcc-636e6d64344a\" (UID: \"de057148-8197-4717-bbcc-636e6d64344a\") " Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.257366 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de057148-8197-4717-bbcc-636e6d64344a-combined-ca-bundle\") pod \"de057148-8197-4717-bbcc-636e6d64344a\" (UID: \"de057148-8197-4717-bbcc-636e6d64344a\") " Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.264160 4781 generic.go:334] "Generic (PLEG): container finished" podID="de057148-8197-4717-bbcc-636e6d64344a" containerID="d4710a5ddae8764add45c7b1042f91b4a8ffc77d221c51eeb2b6c15be66002de" exitCode=0 Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.264239 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.264297 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.264832 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"de057148-8197-4717-bbcc-636e6d64344a","Type":"ContainerDied","Data":"d4710a5ddae8764add45c7b1042f91b4a8ffc77d221c51eeb2b6c15be66002de"} Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.264865 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"de057148-8197-4717-bbcc-636e6d64344a","Type":"ContainerDied","Data":"5ecdf1c41abef4437c80f6d85c04db80a9d6858579c757ef6823795e81d59b23"} Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.264886 4781 scope.go:117] "RemoveContainer" containerID="d4710a5ddae8764add45c7b1042f91b4a8ffc77d221c51eeb2b6c15be66002de" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.278795 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de057148-8197-4717-bbcc-636e6d64344a-kube-api-access-9ccw8" (OuterVolumeSpecName: "kube-api-access-9ccw8") pod "de057148-8197-4717-bbcc-636e6d64344a" (UID: "de057148-8197-4717-bbcc-636e6d64344a"). InnerVolumeSpecName "kube-api-access-9ccw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.293307 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de057148-8197-4717-bbcc-636e6d64344a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de057148-8197-4717-bbcc-636e6d64344a" (UID: "de057148-8197-4717-bbcc-636e6d64344a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.303721 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de057148-8197-4717-bbcc-636e6d64344a-config-data" (OuterVolumeSpecName: "config-data") pod "de057148-8197-4717-bbcc-636e6d64344a" (UID: "de057148-8197-4717-bbcc-636e6d64344a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.361292 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de057148-8197-4717-bbcc-636e6d64344a-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.361335 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ccw8\" (UniqueName: \"kubernetes.io/projected/de057148-8197-4717-bbcc-636e6d64344a-kube-api-access-9ccw8\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.361354 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de057148-8197-4717-bbcc-636e6d64344a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.382210 4781 scope.go:117] "RemoveContainer" containerID="d4710a5ddae8764add45c7b1042f91b4a8ffc77d221c51eeb2b6c15be66002de" Feb 27 00:29:25 crc kubenswrapper[4781]: E0227 00:29:25.385006 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4710a5ddae8764add45c7b1042f91b4a8ffc77d221c51eeb2b6c15be66002de\": container with ID starting with d4710a5ddae8764add45c7b1042f91b4a8ffc77d221c51eeb2b6c15be66002de not found: ID does not exist" containerID="d4710a5ddae8764add45c7b1042f91b4a8ffc77d221c51eeb2b6c15be66002de" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 
00:29:25.385059 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4710a5ddae8764add45c7b1042f91b4a8ffc77d221c51eeb2b6c15be66002de"} err="failed to get container status \"d4710a5ddae8764add45c7b1042f91b4a8ffc77d221c51eeb2b6c15be66002de\": rpc error: code = NotFound desc = could not find container \"d4710a5ddae8764add45c7b1042f91b4a8ffc77d221c51eeb2b6c15be66002de\": container with ID starting with d4710a5ddae8764add45c7b1042f91b4a8ffc77d221c51eeb2b6c15be66002de not found: ID does not exist" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.394780 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.403699 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.415998 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 27 00:29:25 crc kubenswrapper[4781]: E0227 00:29:25.416407 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ed68203-3ac6-4133-92d9-175f234d5229" containerName="nova-metadata-log" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.416427 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ed68203-3ac6-4133-92d9-175f234d5229" containerName="nova-metadata-log" Feb 27 00:29:25 crc kubenswrapper[4781]: E0227 00:29:25.416443 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de057148-8197-4717-bbcc-636e6d64344a" containerName="nova-scheduler-scheduler" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.416450 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="de057148-8197-4717-bbcc-636e6d64344a" containerName="nova-scheduler-scheduler" Feb 27 00:29:25 crc kubenswrapper[4781]: E0227 00:29:25.416465 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd521dc6-4126-4c51-8634-66db8ba1412e" 
containerName="nova-manage" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.416472 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd521dc6-4126-4c51-8634-66db8ba1412e" containerName="nova-manage" Feb 27 00:29:25 crc kubenswrapper[4781]: E0227 00:29:25.416487 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39b2afc0-76d7-48e9-8528-f88e3ba22955" containerName="dnsmasq-dns" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.416492 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="39b2afc0-76d7-48e9-8528-f88e3ba22955" containerName="dnsmasq-dns" Feb 27 00:29:25 crc kubenswrapper[4781]: E0227 00:29:25.416501 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ed68203-3ac6-4133-92d9-175f234d5229" containerName="nova-metadata-metadata" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.416508 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ed68203-3ac6-4133-92d9-175f234d5229" containerName="nova-metadata-metadata" Feb 27 00:29:25 crc kubenswrapper[4781]: E0227 00:29:25.416523 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39b2afc0-76d7-48e9-8528-f88e3ba22955" containerName="init" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.416530 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="39b2afc0-76d7-48e9-8528-f88e3ba22955" containerName="init" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.416741 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ed68203-3ac6-4133-92d9-175f234d5229" containerName="nova-metadata-log" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.416754 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ed68203-3ac6-4133-92d9-175f234d5229" containerName="nova-metadata-metadata" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.416769 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd521dc6-4126-4c51-8634-66db8ba1412e" 
containerName="nova-manage" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.416779 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="39b2afc0-76d7-48e9-8528-f88e3ba22955" containerName="dnsmasq-dns" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.416804 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="de057148-8197-4717-bbcc-636e6d64344a" containerName="nova-scheduler-scheduler" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.417791 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.420555 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.420791 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.446391 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.462674 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94rnf\" (UniqueName: \"kubernetes.io/projected/7524846b-772f-47a1-aaae-e7f29db2c0b5-kube-api-access-94rnf\") pod \"nova-metadata-0\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") " pod="openstack/nova-metadata-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.462720 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7524846b-772f-47a1-aaae-e7f29db2c0b5-config-data\") pod \"nova-metadata-0\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") " pod="openstack/nova-metadata-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.462779 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7524846b-772f-47a1-aaae-e7f29db2c0b5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") " pod="openstack/nova-metadata-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.462826 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7524846b-772f-47a1-aaae-e7f29db2c0b5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") " pod="openstack/nova-metadata-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.462851 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7524846b-772f-47a1-aaae-e7f29db2c0b5-logs\") pod \"nova-metadata-0\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") " pod="openstack/nova-metadata-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.564778 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94rnf\" (UniqueName: \"kubernetes.io/projected/7524846b-772f-47a1-aaae-e7f29db2c0b5-kube-api-access-94rnf\") pod \"nova-metadata-0\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") " pod="openstack/nova-metadata-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.564828 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7524846b-772f-47a1-aaae-e7f29db2c0b5-config-data\") pod \"nova-metadata-0\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") " pod="openstack/nova-metadata-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.564900 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7524846b-772f-47a1-aaae-e7f29db2c0b5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") " pod="openstack/nova-metadata-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.564950 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7524846b-772f-47a1-aaae-e7f29db2c0b5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") " pod="openstack/nova-metadata-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.564985 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7524846b-772f-47a1-aaae-e7f29db2c0b5-logs\") pod \"nova-metadata-0\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") " pod="openstack/nova-metadata-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.565577 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7524846b-772f-47a1-aaae-e7f29db2c0b5-logs\") pod \"nova-metadata-0\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") " pod="openstack/nova-metadata-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.568529 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7524846b-772f-47a1-aaae-e7f29db2c0b5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") " pod="openstack/nova-metadata-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.568671 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7524846b-772f-47a1-aaae-e7f29db2c0b5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") " pod="openstack/nova-metadata-0" Feb 27 00:29:25 crc 
kubenswrapper[4781]: I0227 00:29:25.570193 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7524846b-772f-47a1-aaae-e7f29db2c0b5-config-data\") pod \"nova-metadata-0\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") " pod="openstack/nova-metadata-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.586214 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94rnf\" (UniqueName: \"kubernetes.io/projected/7524846b-772f-47a1-aaae-e7f29db2c0b5-kube-api-access-94rnf\") pod \"nova-metadata-0\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") " pod="openstack/nova-metadata-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.622787 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.644075 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.663533 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.665241 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.669744 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.675845 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.734242 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.769971 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5e01f6b-d306-41ac-9988-156063c5af7d-config-data\") pod \"nova-scheduler-0\" (UID: \"f5e01f6b-d306-41ac-9988-156063c5af7d\") " pod="openstack/nova-scheduler-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.770170 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrw78\" (UniqueName: \"kubernetes.io/projected/f5e01f6b-d306-41ac-9988-156063c5af7d-kube-api-access-mrw78\") pod \"nova-scheduler-0\" (UID: \"f5e01f6b-d306-41ac-9988-156063c5af7d\") " pod="openstack/nova-scheduler-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.770200 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5e01f6b-d306-41ac-9988-156063c5af7d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f5e01f6b-d306-41ac-9988-156063c5af7d\") " pod="openstack/nova-scheduler-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.872363 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5e01f6b-d306-41ac-9988-156063c5af7d-config-data\") pod \"nova-scheduler-0\" (UID: \"f5e01f6b-d306-41ac-9988-156063c5af7d\") " pod="openstack/nova-scheduler-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.872876 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrw78\" (UniqueName: \"kubernetes.io/projected/f5e01f6b-d306-41ac-9988-156063c5af7d-kube-api-access-mrw78\") pod \"nova-scheduler-0\" (UID: \"f5e01f6b-d306-41ac-9988-156063c5af7d\") " pod="openstack/nova-scheduler-0" Feb 27 00:29:25 crc kubenswrapper[4781]: 
I0227 00:29:25.872908 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5e01f6b-d306-41ac-9988-156063c5af7d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f5e01f6b-d306-41ac-9988-156063c5af7d\") " pod="openstack/nova-scheduler-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.880719 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5e01f6b-d306-41ac-9988-156063c5af7d-config-data\") pod \"nova-scheduler-0\" (UID: \"f5e01f6b-d306-41ac-9988-156063c5af7d\") " pod="openstack/nova-scheduler-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.889948 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrw78\" (UniqueName: \"kubernetes.io/projected/f5e01f6b-d306-41ac-9988-156063c5af7d-kube-api-access-mrw78\") pod \"nova-scheduler-0\" (UID: \"f5e01f6b-d306-41ac-9988-156063c5af7d\") " pod="openstack/nova-scheduler-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.890950 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5e01f6b-d306-41ac-9988-156063c5af7d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f5e01f6b-d306-41ac-9988-156063c5af7d\") " pod="openstack/nova-scheduler-0" Feb 27 00:29:25 crc kubenswrapper[4781]: I0227 00:29:25.980544 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 27 00:29:26 crc kubenswrapper[4781]: I0227 00:29:26.195394 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 00:29:26 crc kubenswrapper[4781]: W0227 00:29:26.198807 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7524846b_772f_47a1_aaae_e7f29db2c0b5.slice/crio-38c766ee39e3c7f63ba0e025dfac0a7e85784b006de99cf2ffb71128d62e3b91 WatchSource:0}: Error finding container 38c766ee39e3c7f63ba0e025dfac0a7e85784b006de99cf2ffb71128d62e3b91: Status 404 returned error can't find the container with id 38c766ee39e3c7f63ba0e025dfac0a7e85784b006de99cf2ffb71128d62e3b91 Feb 27 00:29:26 crc kubenswrapper[4781]: I0227 00:29:26.275686 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7524846b-772f-47a1-aaae-e7f29db2c0b5","Type":"ContainerStarted","Data":"38c766ee39e3c7f63ba0e025dfac0a7e85784b006de99cf2ffb71128d62e3b91"} Feb 27 00:29:26 crc kubenswrapper[4781]: I0227 00:29:26.492937 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 00:29:26 crc kubenswrapper[4781]: W0227 00:29:26.507068 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5e01f6b_d306_41ac_9988_156063c5af7d.slice/crio-9fd55676c924089cae058f2bc5bdb6090578b3476a36c5a733237f94e45c9618 WatchSource:0}: Error finding container 9fd55676c924089cae058f2bc5bdb6090578b3476a36c5a733237f94e45c9618: Status 404 returned error can't find the container with id 9fd55676c924089cae058f2bc5bdb6090578b3476a36c5a733237f94e45c9618 Feb 27 00:29:27 crc kubenswrapper[4781]: I0227 00:29:27.300362 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"7524846b-772f-47a1-aaae-e7f29db2c0b5","Type":"ContainerStarted","Data":"8b0334950030ff6d04fa6ce0a9d86218c27e54eace8f56b035169aad1a3acccf"} Feb 27 00:29:27 crc kubenswrapper[4781]: I0227 00:29:27.301215 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7524846b-772f-47a1-aaae-e7f29db2c0b5","Type":"ContainerStarted","Data":"a1babffa87bc8c759ec31af189e69efb911ccecd50fe63e3bf9d7c05e890f1a9"} Feb 27 00:29:27 crc kubenswrapper[4781]: I0227 00:29:27.302320 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f5e01f6b-d306-41ac-9988-156063c5af7d","Type":"ContainerStarted","Data":"8a434297e1d497ddfa074c1233744a9c79e7a3482bb8e37e36657a3849467eab"} Feb 27 00:29:27 crc kubenswrapper[4781]: I0227 00:29:27.302365 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f5e01f6b-d306-41ac-9988-156063c5af7d","Type":"ContainerStarted","Data":"9fd55676c924089cae058f2bc5bdb6090578b3476a36c5a733237f94e45c9618"} Feb 27 00:29:27 crc kubenswrapper[4781]: I0227 00:29:27.305888 4781 generic.go:334] "Generic (PLEG): container finished" podID="b607db2c-2aa3-48f0-9cd8-c5461797431c" containerID="39276ac01bb5ee770105ba2bf75f8d61d8081e22c89cdaa97c9f7ed7f2722110" exitCode=0 Feb 27 00:29:27 crc kubenswrapper[4781]: I0227 00:29:27.305947 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tg9k8" event={"ID":"b607db2c-2aa3-48f0-9cd8-c5461797431c","Type":"ContainerDied","Data":"39276ac01bb5ee770105ba2bf75f8d61d8081e22c89cdaa97c9f7ed7f2722110"} Feb 27 00:29:27 crc kubenswrapper[4781]: I0227 00:29:27.331683 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.331665443 podStartE2EDuration="2.331665443s" podCreationTimestamp="2026-02-27 00:29:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:29:27.323341918 +0000 UTC m=+1436.580881502" watchObservedRunningTime="2026-02-27 00:29:27.331665443 +0000 UTC m=+1436.589204997" Feb 27 00:29:27 crc kubenswrapper[4781]: I0227 00:29:27.341377 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ed68203-3ac6-4133-92d9-175f234d5229" path="/var/lib/kubelet/pods/5ed68203-3ac6-4133-92d9-175f234d5229/volumes" Feb 27 00:29:27 crc kubenswrapper[4781]: I0227 00:29:27.342209 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de057148-8197-4717-bbcc-636e6d64344a" path="/var/lib/kubelet/pods/de057148-8197-4717-bbcc-636e6d64344a/volumes" Feb 27 00:29:27 crc kubenswrapper[4781]: I0227 00:29:27.366856 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.366834733 podStartE2EDuration="2.366834733s" podCreationTimestamp="2026-02-27 00:29:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:29:27.356774352 +0000 UTC m=+1436.614313906" watchObservedRunningTime="2026-02-27 00:29:27.366834733 +0000 UTC m=+1436.624374287" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.271753 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.321284 4781 generic.go:334] "Generic (PLEG): container finished" podID="902efa6b-d07e-4589-b6e6-8016dfdbcd57" containerID="ea25260e9b2c094b0c8bedd71e82fbdb32ad5e8cdbc2cbf11a8406927316566e" exitCode=0 Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.321335 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.321383 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"902efa6b-d07e-4589-b6e6-8016dfdbcd57","Type":"ContainerDied","Data":"ea25260e9b2c094b0c8bedd71e82fbdb32ad5e8cdbc2cbf11a8406927316566e"} Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.321411 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"902efa6b-d07e-4589-b6e6-8016dfdbcd57","Type":"ContainerDied","Data":"10dbcf9aa331b09eb162dae4f7eb67ae5890ce7956c09aaa8725da5e211a8996"} Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.321429 4781 scope.go:117] "RemoveContainer" containerID="ea25260e9b2c094b0c8bedd71e82fbdb32ad5e8cdbc2cbf11a8406927316566e" Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.324788 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/902efa6b-d07e-4589-b6e6-8016dfdbcd57-logs\") pod \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\" (UID: \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\") " Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.324907 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/902efa6b-d07e-4589-b6e6-8016dfdbcd57-combined-ca-bundle\") pod \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\" (UID: \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\") " Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.325041 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7gxp\" (UniqueName: \"kubernetes.io/projected/902efa6b-d07e-4589-b6e6-8016dfdbcd57-kube-api-access-x7gxp\") pod \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\" (UID: \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\") " Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.325205 4781 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/902efa6b-d07e-4589-b6e6-8016dfdbcd57-config-data\") pod \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\" (UID: \"902efa6b-d07e-4589-b6e6-8016dfdbcd57\") "
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.326921 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/902efa6b-d07e-4589-b6e6-8016dfdbcd57-logs" (OuterVolumeSpecName: "logs") pod "902efa6b-d07e-4589-b6e6-8016dfdbcd57" (UID: "902efa6b-d07e-4589-b6e6-8016dfdbcd57"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.330921 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/902efa6b-d07e-4589-b6e6-8016dfdbcd57-kube-api-access-x7gxp" (OuterVolumeSpecName: "kube-api-access-x7gxp") pod "902efa6b-d07e-4589-b6e6-8016dfdbcd57" (UID: "902efa6b-d07e-4589-b6e6-8016dfdbcd57"). InnerVolumeSpecName "kube-api-access-x7gxp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.358208 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/902efa6b-d07e-4589-b6e6-8016dfdbcd57-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "902efa6b-d07e-4589-b6e6-8016dfdbcd57" (UID: "902efa6b-d07e-4589-b6e6-8016dfdbcd57"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.363500 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/902efa6b-d07e-4589-b6e6-8016dfdbcd57-config-data" (OuterVolumeSpecName: "config-data") pod "902efa6b-d07e-4589-b6e6-8016dfdbcd57" (UID: "902efa6b-d07e-4589-b6e6-8016dfdbcd57"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.432624 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7gxp\" (UniqueName: \"kubernetes.io/projected/902efa6b-d07e-4589-b6e6-8016dfdbcd57-kube-api-access-x7gxp\") on node \"crc\" DevicePath \"\""
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.432699 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/902efa6b-d07e-4589-b6e6-8016dfdbcd57-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.432713 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/902efa6b-d07e-4589-b6e6-8016dfdbcd57-logs\") on node \"crc\" DevicePath \"\""
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.432754 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/902efa6b-d07e-4589-b6e6-8016dfdbcd57-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.536643 4781 scope.go:117] "RemoveContainer" containerID="df5ce0884ce3591c063c1b53f0f9c2afd193cd3ff9ef524d87062b9d89532b03"
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.571949 4781 scope.go:117] "RemoveContainer" containerID="ea25260e9b2c094b0c8bedd71e82fbdb32ad5e8cdbc2cbf11a8406927316566e"
Feb 27 00:29:28 crc kubenswrapper[4781]: E0227 00:29:28.583757 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea25260e9b2c094b0c8bedd71e82fbdb32ad5e8cdbc2cbf11a8406927316566e\": container with ID starting with ea25260e9b2c094b0c8bedd71e82fbdb32ad5e8cdbc2cbf11a8406927316566e not found: ID does not exist" containerID="ea25260e9b2c094b0c8bedd71e82fbdb32ad5e8cdbc2cbf11a8406927316566e"
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.583810 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea25260e9b2c094b0c8bedd71e82fbdb32ad5e8cdbc2cbf11a8406927316566e"} err="failed to get container status \"ea25260e9b2c094b0c8bedd71e82fbdb32ad5e8cdbc2cbf11a8406927316566e\": rpc error: code = NotFound desc = could not find container \"ea25260e9b2c094b0c8bedd71e82fbdb32ad5e8cdbc2cbf11a8406927316566e\": container with ID starting with ea25260e9b2c094b0c8bedd71e82fbdb32ad5e8cdbc2cbf11a8406927316566e not found: ID does not exist"
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.583841 4781 scope.go:117] "RemoveContainer" containerID="df5ce0884ce3591c063c1b53f0f9c2afd193cd3ff9ef524d87062b9d89532b03"
Feb 27 00:29:28 crc kubenswrapper[4781]: E0227 00:29:28.584873 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df5ce0884ce3591c063c1b53f0f9c2afd193cd3ff9ef524d87062b9d89532b03\": container with ID starting with df5ce0884ce3591c063c1b53f0f9c2afd193cd3ff9ef524d87062b9d89532b03 not found: ID does not exist" containerID="df5ce0884ce3591c063c1b53f0f9c2afd193cd3ff9ef524d87062b9d89532b03"
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.584918 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df5ce0884ce3591c063c1b53f0f9c2afd193cd3ff9ef524d87062b9d89532b03"} err="failed to get container status \"df5ce0884ce3591c063c1b53f0f9c2afd193cd3ff9ef524d87062b9d89532b03\": rpc error: code = NotFound desc = could not find container \"df5ce0884ce3591c063c1b53f0f9c2afd193cd3ff9ef524d87062b9d89532b03\": container with ID starting with df5ce0884ce3591c063c1b53f0f9c2afd193cd3ff9ef524d87062b9d89532b03 not found: ID does not exist"
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.659100 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.674859 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.687324 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 27 00:29:28 crc kubenswrapper[4781]: E0227 00:29:28.687835 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="902efa6b-d07e-4589-b6e6-8016dfdbcd57" containerName="nova-api-api"
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.687854 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="902efa6b-d07e-4589-b6e6-8016dfdbcd57" containerName="nova-api-api"
Feb 27 00:29:28 crc kubenswrapper[4781]: E0227 00:29:28.687869 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="902efa6b-d07e-4589-b6e6-8016dfdbcd57" containerName="nova-api-log"
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.687875 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="902efa6b-d07e-4589-b6e6-8016dfdbcd57" containerName="nova-api-log"
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.688110 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="902efa6b-d07e-4589-b6e6-8016dfdbcd57" containerName="nova-api-log"
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.688130 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="902efa6b-d07e-4589-b6e6-8016dfdbcd57" containerName="nova-api-api"
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.689338 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.692052 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.708431 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.731694 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tg9k8"
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.739225 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e57ffac0-932b-42fd-bc09-ae357b25eeb1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e57ffac0-932b-42fd-bc09-ae357b25eeb1\") " pod="openstack/nova-api-0"
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.739311 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e57ffac0-932b-42fd-bc09-ae357b25eeb1-config-data\") pod \"nova-api-0\" (UID: \"e57ffac0-932b-42fd-bc09-ae357b25eeb1\") " pod="openstack/nova-api-0"
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.739456 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p577n\" (UniqueName: \"kubernetes.io/projected/e57ffac0-932b-42fd-bc09-ae357b25eeb1-kube-api-access-p577n\") pod \"nova-api-0\" (UID: \"e57ffac0-932b-42fd-bc09-ae357b25eeb1\") " pod="openstack/nova-api-0"
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.739478 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e57ffac0-932b-42fd-bc09-ae357b25eeb1-logs\") pod \"nova-api-0\" (UID: \"e57ffac0-932b-42fd-bc09-ae357b25eeb1\") " pod="openstack/nova-api-0"
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.841705 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b607db2c-2aa3-48f0-9cd8-c5461797431c-combined-ca-bundle\") pod \"b607db2c-2aa3-48f0-9cd8-c5461797431c\" (UID: \"b607db2c-2aa3-48f0-9cd8-c5461797431c\") "
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.841925 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b607db2c-2aa3-48f0-9cd8-c5461797431c-scripts\") pod \"b607db2c-2aa3-48f0-9cd8-c5461797431c\" (UID: \"b607db2c-2aa3-48f0-9cd8-c5461797431c\") "
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.842039 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b607db2c-2aa3-48f0-9cd8-c5461797431c-config-data\") pod \"b607db2c-2aa3-48f0-9cd8-c5461797431c\" (UID: \"b607db2c-2aa3-48f0-9cd8-c5461797431c\") "
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.842090 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7dh2\" (UniqueName: \"kubernetes.io/projected/b607db2c-2aa3-48f0-9cd8-c5461797431c-kube-api-access-f7dh2\") pod \"b607db2c-2aa3-48f0-9cd8-c5461797431c\" (UID: \"b607db2c-2aa3-48f0-9cd8-c5461797431c\") "
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.842405 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p577n\" (UniqueName: \"kubernetes.io/projected/e57ffac0-932b-42fd-bc09-ae357b25eeb1-kube-api-access-p577n\") pod \"nova-api-0\" (UID: \"e57ffac0-932b-42fd-bc09-ae357b25eeb1\") " pod="openstack/nova-api-0"
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.842436 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e57ffac0-932b-42fd-bc09-ae357b25eeb1-logs\") pod \"nova-api-0\" (UID: \"e57ffac0-932b-42fd-bc09-ae357b25eeb1\") " pod="openstack/nova-api-0"
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.842561 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e57ffac0-932b-42fd-bc09-ae357b25eeb1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e57ffac0-932b-42fd-bc09-ae357b25eeb1\") " pod="openstack/nova-api-0"
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.842593 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e57ffac0-932b-42fd-bc09-ae357b25eeb1-config-data\") pod \"nova-api-0\" (UID: \"e57ffac0-932b-42fd-bc09-ae357b25eeb1\") " pod="openstack/nova-api-0"
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.843124 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e57ffac0-932b-42fd-bc09-ae357b25eeb1-logs\") pod \"nova-api-0\" (UID: \"e57ffac0-932b-42fd-bc09-ae357b25eeb1\") " pod="openstack/nova-api-0"
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.845588 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b607db2c-2aa3-48f0-9cd8-c5461797431c-kube-api-access-f7dh2" (OuterVolumeSpecName: "kube-api-access-f7dh2") pod "b607db2c-2aa3-48f0-9cd8-c5461797431c" (UID: "b607db2c-2aa3-48f0-9cd8-c5461797431c"). InnerVolumeSpecName "kube-api-access-f7dh2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.846862 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e57ffac0-932b-42fd-bc09-ae357b25eeb1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e57ffac0-932b-42fd-bc09-ae357b25eeb1\") " pod="openstack/nova-api-0"
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.852179 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b607db2c-2aa3-48f0-9cd8-c5461797431c-scripts" (OuterVolumeSpecName: "scripts") pod "b607db2c-2aa3-48f0-9cd8-c5461797431c" (UID: "b607db2c-2aa3-48f0-9cd8-c5461797431c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.852210 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e57ffac0-932b-42fd-bc09-ae357b25eeb1-config-data\") pod \"nova-api-0\" (UID: \"e57ffac0-932b-42fd-bc09-ae357b25eeb1\") " pod="openstack/nova-api-0"
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.860673 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p577n\" (UniqueName: \"kubernetes.io/projected/e57ffac0-932b-42fd-bc09-ae357b25eeb1-kube-api-access-p577n\") pod \"nova-api-0\" (UID: \"e57ffac0-932b-42fd-bc09-ae357b25eeb1\") " pod="openstack/nova-api-0"
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.876568 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b607db2c-2aa3-48f0-9cd8-c5461797431c-config-data" (OuterVolumeSpecName: "config-data") pod "b607db2c-2aa3-48f0-9cd8-c5461797431c" (UID: "b607db2c-2aa3-48f0-9cd8-c5461797431c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.877269 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b607db2c-2aa3-48f0-9cd8-c5461797431c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b607db2c-2aa3-48f0-9cd8-c5461797431c" (UID: "b607db2c-2aa3-48f0-9cd8-c5461797431c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.944109 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b607db2c-2aa3-48f0-9cd8-c5461797431c-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.944432 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b607db2c-2aa3-48f0-9cd8-c5461797431c-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.944443 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7dh2\" (UniqueName: \"kubernetes.io/projected/b607db2c-2aa3-48f0-9cd8-c5461797431c-kube-api-access-f7dh2\") on node \"crc\" DevicePath \"\""
Feb 27 00:29:28 crc kubenswrapper[4781]: I0227 00:29:28.944454 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b607db2c-2aa3-48f0-9cd8-c5461797431c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.025780 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.320019 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="902efa6b-d07e-4589-b6e6-8016dfdbcd57" path="/var/lib/kubelet/pods/902efa6b-d07e-4589-b6e6-8016dfdbcd57/volumes"
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.334143 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tg9k8" event={"ID":"b607db2c-2aa3-48f0-9cd8-c5461797431c","Type":"ContainerDied","Data":"34decb527184bd4ff609a070de28e4da18ec18094d195f97919d2454806f58d8"}
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.334181 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34decb527184bd4ff609a070de28e4da18ec18094d195f97919d2454806f58d8"
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.334216 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tg9k8"
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.419806 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 27 00:29:29 crc kubenswrapper[4781]: E0227 00:29:29.421253 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b607db2c-2aa3-48f0-9cd8-c5461797431c" containerName="nova-cell1-conductor-db-sync"
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.421274 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b607db2c-2aa3-48f0-9cd8-c5461797431c" containerName="nova-cell1-conductor-db-sync"
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.421458 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="b607db2c-2aa3-48f0-9cd8-c5461797431c" containerName="nova-cell1-conductor-db-sync"
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.422189 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.425551 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.430541 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.457125 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8c40a18-7bbd-4d06-8a8a-427de95016fa-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c8c40a18-7bbd-4d06-8a8a-427de95016fa\") " pod="openstack/nova-cell1-conductor-0"
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.457173 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz6mf\" (UniqueName: \"kubernetes.io/projected/c8c40a18-7bbd-4d06-8a8a-427de95016fa-kube-api-access-rz6mf\") pod \"nova-cell1-conductor-0\" (UID: \"c8c40a18-7bbd-4d06-8a8a-427de95016fa\") " pod="openstack/nova-cell1-conductor-0"
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.457312 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8c40a18-7bbd-4d06-8a8a-427de95016fa-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c8c40a18-7bbd-4d06-8a8a-427de95016fa\") " pod="openstack/nova-cell1-conductor-0"
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.502677 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.558967 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8c40a18-7bbd-4d06-8a8a-427de95016fa-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c8c40a18-7bbd-4d06-8a8a-427de95016fa\") " pod="openstack/nova-cell1-conductor-0"
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.559057 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8c40a18-7bbd-4d06-8a8a-427de95016fa-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c8c40a18-7bbd-4d06-8a8a-427de95016fa\") " pod="openstack/nova-cell1-conductor-0"
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.559091 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz6mf\" (UniqueName: \"kubernetes.io/projected/c8c40a18-7bbd-4d06-8a8a-427de95016fa-kube-api-access-rz6mf\") pod \"nova-cell1-conductor-0\" (UID: \"c8c40a18-7bbd-4d06-8a8a-427de95016fa\") " pod="openstack/nova-cell1-conductor-0"
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.566074 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8c40a18-7bbd-4d06-8a8a-427de95016fa-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c8c40a18-7bbd-4d06-8a8a-427de95016fa\") " pod="openstack/nova-cell1-conductor-0"
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.566129 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8c40a18-7bbd-4d06-8a8a-427de95016fa-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c8c40a18-7bbd-4d06-8a8a-427de95016fa\") " pod="openstack/nova-cell1-conductor-0"
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.575503 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz6mf\" (UniqueName: \"kubernetes.io/projected/c8c40a18-7bbd-4d06-8a8a-427de95016fa-kube-api-access-rz6mf\") pod \"nova-cell1-conductor-0\" (UID: \"c8c40a18-7bbd-4d06-8a8a-427de95016fa\") " pod="openstack/nova-cell1-conductor-0"
Feb 27 00:29:29 crc kubenswrapper[4781]: I0227 00:29:29.738286 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 27 00:29:30 crc kubenswrapper[4781]: I0227 00:29:30.193457 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 27 00:29:30 crc kubenswrapper[4781]: W0227 00:29:30.201909 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8c40a18_7bbd_4d06_8a8a_427de95016fa.slice/crio-586fdf84ca64b6fcd432c2744daca0f29ae595ee9ccb98b66f85d651192101c5 WatchSource:0}: Error finding container 586fdf84ca64b6fcd432c2744daca0f29ae595ee9ccb98b66f85d651192101c5: Status 404 returned error can't find the container with id 586fdf84ca64b6fcd432c2744daca0f29ae595ee9ccb98b66f85d651192101c5
Feb 27 00:29:30 crc kubenswrapper[4781]: I0227 00:29:30.347515 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e57ffac0-932b-42fd-bc09-ae357b25eeb1","Type":"ContainerStarted","Data":"b178a99be777daf87be803a4cbee7758b07e3b3c2d83d9e051bfa4db85be0c6d"}
Feb 27 00:29:30 crc kubenswrapper[4781]: I0227 00:29:30.347571 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e57ffac0-932b-42fd-bc09-ae357b25eeb1","Type":"ContainerStarted","Data":"45fa49853f49c290824390a083c818fc5bca5ced860bd643b7205e56c631d922"}
Feb 27 00:29:30 crc kubenswrapper[4781]: I0227 00:29:30.347586 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e57ffac0-932b-42fd-bc09-ae357b25eeb1","Type":"ContainerStarted","Data":"52f4435171fea776734e465dadfa7d220c142ef75d0364376751af62a2757023"}
Feb 27 00:29:30 crc kubenswrapper[4781]: I0227 00:29:30.350617 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c8c40a18-7bbd-4d06-8a8a-427de95016fa","Type":"ContainerStarted","Data":"586fdf84ca64b6fcd432c2744daca0f29ae595ee9ccb98b66f85d651192101c5"}
Feb 27 00:29:30 crc kubenswrapper[4781]: I0227 00:29:30.375599 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.375578358 podStartE2EDuration="2.375578358s" podCreationTimestamp="2026-02-27 00:29:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:29:30.366971636 +0000 UTC m=+1439.624511210" watchObservedRunningTime="2026-02-27 00:29:30.375578358 +0000 UTC m=+1439.633117912"
Feb 27 00:29:30 crc kubenswrapper[4781]: I0227 00:29:30.735408 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 27 00:29:30 crc kubenswrapper[4781]: I0227 00:29:30.735467 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 27 00:29:30 crc kubenswrapper[4781]: I0227 00:29:30.981563 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 27 00:29:31 crc kubenswrapper[4781]: I0227 00:29:31.360830 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c8c40a18-7bbd-4d06-8a8a-427de95016fa","Type":"ContainerStarted","Data":"6f523ef0991ad019f7285afab5b492d902d65f84dce4c9da8e302ff112aac4c6"}
Feb 27 00:29:31 crc kubenswrapper[4781]: I0227 00:29:31.361640 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Feb 27 00:29:31 crc kubenswrapper[4781]: I0227 00:29:31.381377 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.381357011 podStartE2EDuration="2.381357011s" podCreationTimestamp="2026-02-27 00:29:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:29:31.381189056 +0000 UTC m=+1440.638728610" watchObservedRunningTime="2026-02-27 00:29:31.381357011 +0000 UTC m=+1440.638896565"
Feb 27 00:29:35 crc kubenswrapper[4781]: I0227 00:29:35.735345 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 27 00:29:35 crc kubenswrapper[4781]: I0227 00:29:35.736032 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 27 00:29:35 crc kubenswrapper[4781]: I0227 00:29:35.982224 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Feb 27 00:29:36 crc kubenswrapper[4781]: I0227 00:29:36.026474 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Feb 27 00:29:36 crc kubenswrapper[4781]: I0227 00:29:36.454885 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 27 00:29:36 crc kubenswrapper[4781]: I0227 00:29:36.749820 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7524846b-772f-47a1-aaae-e7f29db2c0b5" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.225:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 27 00:29:36 crc kubenswrapper[4781]: I0227 00:29:36.749895 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7524846b-772f-47a1-aaae-e7f29db2c0b5" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.225:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 27 00:29:39 crc kubenswrapper[4781]: I0227 00:29:39.028009 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 27 00:29:39 crc kubenswrapper[4781]: I0227 00:29:39.029203 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 27 00:29:39 crc kubenswrapper[4781]: I0227 00:29:39.770939 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Feb 27 00:29:40 crc kubenswrapper[4781]: I0227 00:29:40.109987 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e57ffac0-932b-42fd-bc09-ae357b25eeb1" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.227:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 27 00:29:40 crc kubenswrapper[4781]: I0227 00:29:40.110041 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e57ffac0-932b-42fd-bc09-ae357b25eeb1" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.227:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 27 00:29:45 crc kubenswrapper[4781]: I0227 00:29:45.739469 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 27 00:29:45 crc kubenswrapper[4781]: I0227 00:29:45.740042 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 27 00:29:45 crc kubenswrapper[4781]: I0227 00:29:45.744005 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 27 00:29:46 crc kubenswrapper[4781]: I0227 00:29:46.446250 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Feb 27 00:29:46 crc kubenswrapper[4781]: I0227 00:29:46.522192 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 27 00:29:47 crc kubenswrapper[4781]: E0227 00:29:47.312409 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d780ba2_9829_430e_9a56_0b5b052bfbb7.slice/crio-conmon-7cb922ac2fcfd76994a7254d975044d1fe0a7563db3547acc86bfb78f94c47a2.scope\": RecentStats: unable to find data in memory cache]"
Feb 27 00:29:47 crc kubenswrapper[4781]: I0227 00:29:47.527321 4781 generic.go:334] "Generic (PLEG): container finished" podID="8d780ba2-9829-430e-9a56-0b5b052bfbb7" containerID="7cb922ac2fcfd76994a7254d975044d1fe0a7563db3547acc86bfb78f94c47a2" exitCode=137
Feb 27 00:29:47 crc kubenswrapper[4781]: I0227 00:29:47.527414 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8d780ba2-9829-430e-9a56-0b5b052bfbb7","Type":"ContainerDied","Data":"7cb922ac2fcfd76994a7254d975044d1fe0a7563db3547acc86bfb78f94c47a2"}
Feb 27 00:29:47 crc kubenswrapper[4781]: I0227 00:29:47.527465 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8d780ba2-9829-430e-9a56-0b5b052bfbb7","Type":"ContainerDied","Data":"c962c4b95a57d2e6b554d146c58ed360df05b5f45a673155452c828cacdad50b"}
Feb 27 00:29:47 crc kubenswrapper[4781]: I0227 00:29:47.527481 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c962c4b95a57d2e6b554d146c58ed360df05b5f45a673155452c828cacdad50b"
Feb 27 00:29:47 crc kubenswrapper[4781]: I0227 00:29:47.573698 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:47 crc kubenswrapper[4781]: I0227 00:29:47.736954 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d780ba2-9829-430e-9a56-0b5b052bfbb7-combined-ca-bundle\") pod \"8d780ba2-9829-430e-9a56-0b5b052bfbb7\" (UID: \"8d780ba2-9829-430e-9a56-0b5b052bfbb7\") "
Feb 27 00:29:47 crc kubenswrapper[4781]: I0227 00:29:47.737138 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d780ba2-9829-430e-9a56-0b5b052bfbb7-config-data\") pod \"8d780ba2-9829-430e-9a56-0b5b052bfbb7\" (UID: \"8d780ba2-9829-430e-9a56-0b5b052bfbb7\") "
Feb 27 00:29:47 crc kubenswrapper[4781]: I0227 00:29:47.737165 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxdkk\" (UniqueName: \"kubernetes.io/projected/8d780ba2-9829-430e-9a56-0b5b052bfbb7-kube-api-access-bxdkk\") pod \"8d780ba2-9829-430e-9a56-0b5b052bfbb7\" (UID: \"8d780ba2-9829-430e-9a56-0b5b052bfbb7\") "
Feb 27 00:29:47 crc kubenswrapper[4781]: I0227 00:29:47.755476 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d780ba2-9829-430e-9a56-0b5b052bfbb7-kube-api-access-bxdkk" (OuterVolumeSpecName: "kube-api-access-bxdkk") pod "8d780ba2-9829-430e-9a56-0b5b052bfbb7" (UID: "8d780ba2-9829-430e-9a56-0b5b052bfbb7"). InnerVolumeSpecName "kube-api-access-bxdkk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:29:47 crc kubenswrapper[4781]: I0227 00:29:47.775270 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d780ba2-9829-430e-9a56-0b5b052bfbb7-config-data" (OuterVolumeSpecName: "config-data") pod "8d780ba2-9829-430e-9a56-0b5b052bfbb7" (UID: "8d780ba2-9829-430e-9a56-0b5b052bfbb7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:29:47 crc kubenswrapper[4781]: I0227 00:29:47.782762 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d780ba2-9829-430e-9a56-0b5b052bfbb7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d780ba2-9829-430e-9a56-0b5b052bfbb7" (UID: "8d780ba2-9829-430e-9a56-0b5b052bfbb7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:29:47 crc kubenswrapper[4781]: I0227 00:29:47.839282 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d780ba2-9829-430e-9a56-0b5b052bfbb7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 00:29:47 crc kubenswrapper[4781]: I0227 00:29:47.839313 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d780ba2-9829-430e-9a56-0b5b052bfbb7-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 00:29:47 crc kubenswrapper[4781]: I0227 00:29:47.839324 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxdkk\" (UniqueName: \"kubernetes.io/projected/8d780ba2-9829-430e-9a56-0b5b052bfbb7-kube-api-access-bxdkk\") on node \"crc\" DevicePath \"\""
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.534870 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.566267 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.576276 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.592669 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 27 00:29:48 crc kubenswrapper[4781]: E0227 00:29:48.593210 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d780ba2-9829-430e-9a56-0b5b052bfbb7" containerName="nova-cell1-novncproxy-novncproxy"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.593232 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d780ba2-9829-430e-9a56-0b5b052bfbb7" containerName="nova-cell1-novncproxy-novncproxy"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.593697 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d780ba2-9829-430e-9a56-0b5b052bfbb7" containerName="nova-cell1-novncproxy-novncproxy"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.594704 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.597552 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.597679 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.598061 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.604171 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.690417 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-49thr"]
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.695171 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-49thr"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.697553 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3b399a8-7654-47f3-be04-759080f4f180-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3b399a8-7654-47f3-be04-759080f4f180\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.697593 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3b399a8-7654-47f3-be04-759080f4f180-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3b399a8-7654-47f3-be04-759080f4f180\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.697743 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvfsn\" (UniqueName: \"kubernetes.io/projected/a3b399a8-7654-47f3-be04-759080f4f180-kube-api-access-fvfsn\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3b399a8-7654-47f3-be04-759080f4f180\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.697776 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3b399a8-7654-47f3-be04-759080f4f180-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3b399a8-7654-47f3-be04-759080f4f180\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.698082 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3b399a8-7654-47f3-be04-759080f4f180-nova-novncproxy-tls-certs\") pod
\"nova-cell1-novncproxy-0\" (UID: \"a3b399a8-7654-47f3-be04-759080f4f180\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.722953 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-49thr"] Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.800587 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc2a000c-f169-4622-8c82-cd4c2baa730a-utilities\") pod \"redhat-operators-49thr\" (UID: \"cc2a000c-f169-4622-8c82-cd4c2baa730a\") " pod="openshift-marketplace/redhat-operators-49thr" Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.800688 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3b399a8-7654-47f3-be04-759080f4f180-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3b399a8-7654-47f3-be04-759080f4f180\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.800709 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3b399a8-7654-47f3-be04-759080f4f180-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3b399a8-7654-47f3-be04-759080f4f180\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.800738 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc2a000c-f169-4622-8c82-cd4c2baa730a-catalog-content\") pod \"redhat-operators-49thr\" (UID: \"cc2a000c-f169-4622-8c82-cd4c2baa730a\") " pod="openshift-marketplace/redhat-operators-49thr" Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.800775 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwm7q\" (UniqueName: \"kubernetes.io/projected/cc2a000c-f169-4622-8c82-cd4c2baa730a-kube-api-access-dwm7q\") pod \"redhat-operators-49thr\" (UID: \"cc2a000c-f169-4622-8c82-cd4c2baa730a\") " pod="openshift-marketplace/redhat-operators-49thr" Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.800940 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvfsn\" (UniqueName: \"kubernetes.io/projected/a3b399a8-7654-47f3-be04-759080f4f180-kube-api-access-fvfsn\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3b399a8-7654-47f3-be04-759080f4f180\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.801025 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3b399a8-7654-47f3-be04-759080f4f180-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3b399a8-7654-47f3-be04-759080f4f180\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.801055 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3b399a8-7654-47f3-be04-759080f4f180-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3b399a8-7654-47f3-be04-759080f4f180\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.806113 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3b399a8-7654-47f3-be04-759080f4f180-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3b399a8-7654-47f3-be04-759080f4f180\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.806156 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3b399a8-7654-47f3-be04-759080f4f180-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3b399a8-7654-47f3-be04-759080f4f180\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.807676 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3b399a8-7654-47f3-be04-759080f4f180-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3b399a8-7654-47f3-be04-759080f4f180\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.815683 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3b399a8-7654-47f3-be04-759080f4f180-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3b399a8-7654-47f3-be04-759080f4f180\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.825177 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvfsn\" (UniqueName: \"kubernetes.io/projected/a3b399a8-7654-47f3-be04-759080f4f180-kube-api-access-fvfsn\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3b399a8-7654-47f3-be04-759080f4f180\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.903096 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc2a000c-f169-4622-8c82-cd4c2baa730a-utilities\") pod \"redhat-operators-49thr\" (UID: \"cc2a000c-f169-4622-8c82-cd4c2baa730a\") " pod="openshift-marketplace/redhat-operators-49thr" Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.903181 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/cc2a000c-f169-4622-8c82-cd4c2baa730a-catalog-content\") pod \"redhat-operators-49thr\" (UID: \"cc2a000c-f169-4622-8c82-cd4c2baa730a\") " pod="openshift-marketplace/redhat-operators-49thr" Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.903224 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwm7q\" (UniqueName: \"kubernetes.io/projected/cc2a000c-f169-4622-8c82-cd4c2baa730a-kube-api-access-dwm7q\") pod \"redhat-operators-49thr\" (UID: \"cc2a000c-f169-4622-8c82-cd4c2baa730a\") " pod="openshift-marketplace/redhat-operators-49thr" Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.905278 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc2a000c-f169-4622-8c82-cd4c2baa730a-utilities\") pod \"redhat-operators-49thr\" (UID: \"cc2a000c-f169-4622-8c82-cd4c2baa730a\") " pod="openshift-marketplace/redhat-operators-49thr" Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.905498 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc2a000c-f169-4622-8c82-cd4c2baa730a-catalog-content\") pod \"redhat-operators-49thr\" (UID: \"cc2a000c-f169-4622-8c82-cd4c2baa730a\") " pod="openshift-marketplace/redhat-operators-49thr" Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.910800 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 27 00:29:48 crc kubenswrapper[4781]: I0227 00:29:48.923972 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwm7q\" (UniqueName: \"kubernetes.io/projected/cc2a000c-f169-4622-8c82-cd4c2baa730a-kube-api-access-dwm7q\") pod \"redhat-operators-49thr\" (UID: \"cc2a000c-f169-4622-8c82-cd4c2baa730a\") " pod="openshift-marketplace/redhat-operators-49thr" Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.013512 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-49thr" Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.044946 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.046301 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.055571 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.060858 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.332295 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d780ba2-9829-430e-9a56-0b5b052bfbb7" path="/var/lib/kubelet/pods/8d780ba2-9829-430e-9a56-0b5b052bfbb7/volumes" Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.443420 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 27 00:29:49 crc kubenswrapper[4781]: W0227 00:29:49.448431 4781 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3b399a8_7654_47f3_be04_759080f4f180.slice/crio-756fbef35db868ba808ee405957034dabe2c897cd150b7acda8df14ae20dd8f7 WatchSource:0}: Error finding container 756fbef35db868ba808ee405957034dabe2c897cd150b7acda8df14ae20dd8f7: Status 404 returned error can't find the container with id 756fbef35db868ba808ee405957034dabe2c897cd150b7acda8df14ae20dd8f7 Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.551843 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a3b399a8-7654-47f3-be04-759080f4f180","Type":"ContainerStarted","Data":"756fbef35db868ba808ee405957034dabe2c897cd150b7acda8df14ae20dd8f7"} Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.551887 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.564881 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.623959 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-49thr"] Feb 27 00:29:49 crc kubenswrapper[4781]: W0227 00:29:49.624240 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc2a000c_f169_4622_8c82_cd4c2baa730a.slice/crio-91f3015c87e164a4f8b695fde0fa0251c13f94e03e4150c65b9ff72adaa25616 WatchSource:0}: Error finding container 91f3015c87e164a4f8b695fde0fa0251c13f94e03e4150c65b9ff72adaa25616: Status 404 returned error can't find the container with id 91f3015c87e164a4f8b695fde0fa0251c13f94e03e4150c65b9ff72adaa25616 Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.743288 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-hm24r"] Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.753381 4781 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.761435 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-hm24r"] Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.839559 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-config\") pod \"dnsmasq-dns-5fd9b586ff-hm24r\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.839641 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-hm24r\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.839670 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-hm24r\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.839718 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwcnr\" (UniqueName: \"kubernetes.io/projected/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-kube-api-access-cwcnr\") pod \"dnsmasq-dns-5fd9b586ff-hm24r\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.839765 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-hm24r\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.839850 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-hm24r\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.942159 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-hm24r\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.942281 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-config\") pod \"dnsmasq-dns-5fd9b586ff-hm24r\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.942308 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-hm24r\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.942327 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-hm24r\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.942361 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwcnr\" (UniqueName: \"kubernetes.io/projected/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-kube-api-access-cwcnr\") pod \"dnsmasq-dns-5fd9b586ff-hm24r\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.942398 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-hm24r\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.943248 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-hm24r\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.943840 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-hm24r\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.944331 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-config\") pod \"dnsmasq-dns-5fd9b586ff-hm24r\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.945129 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-hm24r\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.945510 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-hm24r\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:29:49 crc kubenswrapper[4781]: I0227 00:29:49.966227 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwcnr\" (UniqueName: \"kubernetes.io/projected/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-kube-api-access-cwcnr\") pod \"dnsmasq-dns-5fd9b586ff-hm24r\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:29:50 crc kubenswrapper[4781]: I0227 00:29:50.094087 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:29:50 crc kubenswrapper[4781]: I0227 00:29:50.562398 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a3b399a8-7654-47f3-be04-759080f4f180","Type":"ContainerStarted","Data":"94ca6f18e3eef7f80d8723d209ec5d280431dc24efb36670ddfc4b66fb3e818e"} Feb 27 00:29:50 crc kubenswrapper[4781]: I0227 00:29:50.565342 4781 generic.go:334] "Generic (PLEG): container finished" podID="cc2a000c-f169-4622-8c82-cd4c2baa730a" containerID="a39372cacd71860e2d25c390539bea5ea2a9770dffde8ab63eabcba10e8b848a" exitCode=0 Feb 27 00:29:50 crc kubenswrapper[4781]: I0227 00:29:50.565554 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49thr" event={"ID":"cc2a000c-f169-4622-8c82-cd4c2baa730a","Type":"ContainerDied","Data":"a39372cacd71860e2d25c390539bea5ea2a9770dffde8ab63eabcba10e8b848a"} Feb 27 00:29:50 crc kubenswrapper[4781]: I0227 00:29:50.565599 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49thr" event={"ID":"cc2a000c-f169-4622-8c82-cd4c2baa730a","Type":"ContainerStarted","Data":"91f3015c87e164a4f8b695fde0fa0251c13f94e03e4150c65b9ff72adaa25616"} Feb 27 00:29:50 crc kubenswrapper[4781]: I0227 00:29:50.671374 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.671331943 podStartE2EDuration="2.671331943s" podCreationTimestamp="2026-02-27 00:29:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:29:50.58573778 +0000 UTC m=+1459.843277344" watchObservedRunningTime="2026-02-27 00:29:50.671331943 +0000 UTC m=+1459.928871487" Feb 27 00:29:50 crc kubenswrapper[4781]: W0227 00:29:50.727324 4781 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8bd7379_6c29_4c0f_bb7e_14c18f98a18e.slice/crio-bc2ce3ec147ae0ea063d3bc5998697394125f3fd2381d07ae16cdf6df5227b71 WatchSource:0}: Error finding container bc2ce3ec147ae0ea063d3bc5998697394125f3fd2381d07ae16cdf6df5227b71: Status 404 returned error can't find the container with id bc2ce3ec147ae0ea063d3bc5998697394125f3fd2381d07ae16cdf6df5227b71 Feb 27 00:29:50 crc kubenswrapper[4781]: I0227 00:29:50.774575 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-hm24r"] Feb 27 00:29:51 crc kubenswrapper[4781]: I0227 00:29:51.578993 4781 generic.go:334] "Generic (PLEG): container finished" podID="f8bd7379-6c29-4c0f-bb7e-14c18f98a18e" containerID="0477def692642480b7baa681e79da18341ef273274b3570944d4f51dd3971947" exitCode=0 Feb 27 00:29:51 crc kubenswrapper[4781]: I0227 00:29:51.579198 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" event={"ID":"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e","Type":"ContainerDied","Data":"0477def692642480b7baa681e79da18341ef273274b3570944d4f51dd3971947"} Feb 27 00:29:51 crc kubenswrapper[4781]: I0227 00:29:51.580705 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" event={"ID":"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e","Type":"ContainerStarted","Data":"bc2ce3ec147ae0ea063d3bc5998697394125f3fd2381d07ae16cdf6df5227b71"} Feb 27 00:29:52 crc kubenswrapper[4781]: I0227 00:29:52.590812 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" event={"ID":"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e","Type":"ContainerStarted","Data":"e59534a993c981832971b7ef17c8f1e9f9d24b23112b98369b8ccf0ba58923af"} Feb 27 00:29:52 crc kubenswrapper[4781]: I0227 00:29:52.591978 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:29:52 crc kubenswrapper[4781]: I0227 
00:29:52.592719 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49thr" event={"ID":"cc2a000c-f169-4622-8c82-cd4c2baa730a","Type":"ContainerStarted","Data":"4da71cf49cfa3d081578fb48fc2ef823314656ae8cf39b6aab2eeb0dcab082a0"} Feb 27 00:29:52 crc kubenswrapper[4781]: I0227 00:29:52.615012 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" podStartSLOduration=3.614992533 podStartE2EDuration="3.614992533s" podCreationTimestamp="2026-02-27 00:29:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:29:52.606159984 +0000 UTC m=+1461.863699548" watchObservedRunningTime="2026-02-27 00:29:52.614992533 +0000 UTC m=+1461.872532087" Feb 27 00:29:52 crc kubenswrapper[4781]: I0227 00:29:52.761474 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:29:52 crc kubenswrapper[4781]: I0227 00:29:52.761824 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerName="ceilometer-central-agent" containerID="cri-o://3ddb72adfd8dadbe432eb551c304f261946dae5663273b00c2b5c6ab9ec5b0b1" gracePeriod=30 Feb 27 00:29:52 crc kubenswrapper[4781]: I0227 00:29:52.762315 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerName="proxy-httpd" containerID="cri-o://78c52f488afaab989176f5c5ab096fb61a4e74a72dc5d52ce83048b14f67d902" gracePeriod=30 Feb 27 00:29:52 crc kubenswrapper[4781]: I0227 00:29:52.762390 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerName="sg-core" 
containerID="cri-o://fd9909df11f574e0138a430f34c72bf18ace2c57464e54425e45df0b7fd14f75" gracePeriod=30 Feb 27 00:29:52 crc kubenswrapper[4781]: I0227 00:29:52.762442 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerName="ceilometer-notification-agent" containerID="cri-o://9a609615b1e77c141503575f4b85bd73b7b9605cdd075e949757163fb3230f19" gracePeriod=30 Feb 27 00:29:53 crc kubenswrapper[4781]: I0227 00:29:53.605225 4781 generic.go:334] "Generic (PLEG): container finished" podID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerID="78c52f488afaab989176f5c5ab096fb61a4e74a72dc5d52ce83048b14f67d902" exitCode=0 Feb 27 00:29:53 crc kubenswrapper[4781]: I0227 00:29:53.605464 4781 generic.go:334] "Generic (PLEG): container finished" podID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerID="fd9909df11f574e0138a430f34c72bf18ace2c57464e54425e45df0b7fd14f75" exitCode=2 Feb 27 00:29:53 crc kubenswrapper[4781]: I0227 00:29:53.605471 4781 generic.go:334] "Generic (PLEG): container finished" podID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerID="3ddb72adfd8dadbe432eb551c304f261946dae5663273b00c2b5c6ab9ec5b0b1" exitCode=0 Feb 27 00:29:53 crc kubenswrapper[4781]: I0227 00:29:53.605272 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7825d67f-c124-4ee9-9e74-32c35c4370c0","Type":"ContainerDied","Data":"78c52f488afaab989176f5c5ab096fb61a4e74a72dc5d52ce83048b14f67d902"} Feb 27 00:29:53 crc kubenswrapper[4781]: I0227 00:29:53.605551 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7825d67f-c124-4ee9-9e74-32c35c4370c0","Type":"ContainerDied","Data":"fd9909df11f574e0138a430f34c72bf18ace2c57464e54425e45df0b7fd14f75"} Feb 27 00:29:53 crc kubenswrapper[4781]: I0227 00:29:53.605562 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7825d67f-c124-4ee9-9e74-32c35c4370c0","Type":"ContainerDied","Data":"3ddb72adfd8dadbe432eb551c304f261946dae5663273b00c2b5c6ab9ec5b0b1"} Feb 27 00:29:53 crc kubenswrapper[4781]: I0227 00:29:53.614217 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 27 00:29:53 crc kubenswrapper[4781]: I0227 00:29:53.614511 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e57ffac0-932b-42fd-bc09-ae357b25eeb1" containerName="nova-api-log" containerID="cri-o://45fa49853f49c290824390a083c818fc5bca5ced860bd643b7205e56c631d922" gracePeriod=30 Feb 27 00:29:53 crc kubenswrapper[4781]: I0227 00:29:53.614708 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e57ffac0-932b-42fd-bc09-ae357b25eeb1" containerName="nova-api-api" containerID="cri-o://b178a99be777daf87be803a4cbee7758b07e3b3c2d83d9e051bfa4db85be0c6d" gracePeriod=30 Feb 27 00:29:53 crc kubenswrapper[4781]: I0227 00:29:53.912173 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 27 00:29:54 crc kubenswrapper[4781]: I0227 00:29:54.615145 4781 generic.go:334] "Generic (PLEG): container finished" podID="e57ffac0-932b-42fd-bc09-ae357b25eeb1" containerID="45fa49853f49c290824390a083c818fc5bca5ced860bd643b7205e56c631d922" exitCode=143 Feb 27 00:29:54 crc kubenswrapper[4781]: I0227 00:29:54.615210 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e57ffac0-932b-42fd-bc09-ae357b25eeb1","Type":"ContainerDied","Data":"45fa49853f49c290824390a083c818fc5bca5ced860bd643b7205e56c631d922"} Feb 27 00:29:56 crc kubenswrapper[4781]: I0227 00:29:56.642024 4781 generic.go:334] "Generic (PLEG): container finished" podID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerID="9a609615b1e77c141503575f4b85bd73b7b9605cdd075e949757163fb3230f19" exitCode=0 Feb 27 00:29:56 crc kubenswrapper[4781]: 
I0227 00:29:56.642095 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7825d67f-c124-4ee9-9e74-32c35c4370c0","Type":"ContainerDied","Data":"9a609615b1e77c141503575f4b85bd73b7b9605cdd075e949757163fb3230f19"} Feb 27 00:29:56 crc kubenswrapper[4781]: I0227 00:29:56.998416 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.099374 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-combined-ca-bundle\") pod \"7825d67f-c124-4ee9-9e74-32c35c4370c0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.099457 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-config-data\") pod \"7825d67f-c124-4ee9-9e74-32c35c4370c0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.099599 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-sg-core-conf-yaml\") pod \"7825d67f-c124-4ee9-9e74-32c35c4370c0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.099668 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-scripts\") pod \"7825d67f-c124-4ee9-9e74-32c35c4370c0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.099761 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/7825d67f-c124-4ee9-9e74-32c35c4370c0-run-httpd\") pod \"7825d67f-c124-4ee9-9e74-32c35c4370c0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.099809 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7825d67f-c124-4ee9-9e74-32c35c4370c0-log-httpd\") pod \"7825d67f-c124-4ee9-9e74-32c35c4370c0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.099948 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tthpk\" (UniqueName: \"kubernetes.io/projected/7825d67f-c124-4ee9-9e74-32c35c4370c0-kube-api-access-tthpk\") pod \"7825d67f-c124-4ee9-9e74-32c35c4370c0\" (UID: \"7825d67f-c124-4ee9-9e74-32c35c4370c0\") " Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.101025 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7825d67f-c124-4ee9-9e74-32c35c4370c0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7825d67f-c124-4ee9-9e74-32c35c4370c0" (UID: "7825d67f-c124-4ee9-9e74-32c35c4370c0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.101479 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7825d67f-c124-4ee9-9e74-32c35c4370c0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7825d67f-c124-4ee9-9e74-32c35c4370c0" (UID: "7825d67f-c124-4ee9-9e74-32c35c4370c0"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.104915 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-scripts" (OuterVolumeSpecName: "scripts") pod "7825d67f-c124-4ee9-9e74-32c35c4370c0" (UID: "7825d67f-c124-4ee9-9e74-32c35c4370c0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.118944 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7825d67f-c124-4ee9-9e74-32c35c4370c0-kube-api-access-tthpk" (OuterVolumeSpecName: "kube-api-access-tthpk") pod "7825d67f-c124-4ee9-9e74-32c35c4370c0" (UID: "7825d67f-c124-4ee9-9e74-32c35c4370c0"). InnerVolumeSpecName "kube-api-access-tthpk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.154814 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7825d67f-c124-4ee9-9e74-32c35c4370c0" (UID: "7825d67f-c124-4ee9-9e74-32c35c4370c0"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.201854 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tthpk\" (UniqueName: \"kubernetes.io/projected/7825d67f-c124-4ee9-9e74-32c35c4370c0-kube-api-access-tthpk\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.201962 4781 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.202014 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.202074 4781 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7825d67f-c124-4ee9-9e74-32c35c4370c0-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.202130 4781 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7825d67f-c124-4ee9-9e74-32c35c4370c0-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.202300 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7825d67f-c124-4ee9-9e74-32c35c4370c0" (UID: "7825d67f-c124-4ee9-9e74-32c35c4370c0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.236530 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-config-data" (OuterVolumeSpecName: "config-data") pod "7825d67f-c124-4ee9-9e74-32c35c4370c0" (UID: "7825d67f-c124-4ee9-9e74-32c35c4370c0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.260543 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.303084 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e57ffac0-932b-42fd-bc09-ae357b25eeb1-combined-ca-bundle\") pod \"e57ffac0-932b-42fd-bc09-ae357b25eeb1\" (UID: \"e57ffac0-932b-42fd-bc09-ae357b25eeb1\") " Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.303236 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e57ffac0-932b-42fd-bc09-ae357b25eeb1-config-data\") pod \"e57ffac0-932b-42fd-bc09-ae357b25eeb1\" (UID: \"e57ffac0-932b-42fd-bc09-ae357b25eeb1\") " Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.303310 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e57ffac0-932b-42fd-bc09-ae357b25eeb1-logs\") pod \"e57ffac0-932b-42fd-bc09-ae357b25eeb1\" (UID: \"e57ffac0-932b-42fd-bc09-ae357b25eeb1\") " Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.303349 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p577n\" (UniqueName: \"kubernetes.io/projected/e57ffac0-932b-42fd-bc09-ae357b25eeb1-kube-api-access-p577n\") pod 
\"e57ffac0-932b-42fd-bc09-ae357b25eeb1\" (UID: \"e57ffac0-932b-42fd-bc09-ae357b25eeb1\") " Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.304003 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.304044 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7825d67f-c124-4ee9-9e74-32c35c4370c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.304483 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e57ffac0-932b-42fd-bc09-ae357b25eeb1-logs" (OuterVolumeSpecName: "logs") pod "e57ffac0-932b-42fd-bc09-ae357b25eeb1" (UID: "e57ffac0-932b-42fd-bc09-ae357b25eeb1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.312317 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e57ffac0-932b-42fd-bc09-ae357b25eeb1-kube-api-access-p577n" (OuterVolumeSpecName: "kube-api-access-p577n") pod "e57ffac0-932b-42fd-bc09-ae357b25eeb1" (UID: "e57ffac0-932b-42fd-bc09-ae357b25eeb1"). InnerVolumeSpecName "kube-api-access-p577n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.331643 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e57ffac0-932b-42fd-bc09-ae357b25eeb1-config-data" (OuterVolumeSpecName: "config-data") pod "e57ffac0-932b-42fd-bc09-ae357b25eeb1" (UID: "e57ffac0-932b-42fd-bc09-ae357b25eeb1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.359502 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e57ffac0-932b-42fd-bc09-ae357b25eeb1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e57ffac0-932b-42fd-bc09-ae357b25eeb1" (UID: "e57ffac0-932b-42fd-bc09-ae357b25eeb1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.405574 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e57ffac0-932b-42fd-bc09-ae357b25eeb1-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.405604 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e57ffac0-932b-42fd-bc09-ae357b25eeb1-logs\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.405616 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p577n\" (UniqueName: \"kubernetes.io/projected/e57ffac0-932b-42fd-bc09-ae357b25eeb1-kube-api-access-p577n\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.405641 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e57ffac0-932b-42fd-bc09-ae357b25eeb1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:57 crc kubenswrapper[4781]: E0227 00:29:57.612381 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7825d67f_c124_4ee9_9e74_32c35c4370c0.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7825d67f_c124_4ee9_9e74_32c35c4370c0.slice/crio-a90b7ced7061699d62e894c9b3b31c21fe93acf06b438953563f0da53923c22d\": RecentStats: unable to find data in memory cache]" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.654199 4781 generic.go:334] "Generic (PLEG): container finished" podID="cc2a000c-f169-4622-8c82-cd4c2baa730a" containerID="4da71cf49cfa3d081578fb48fc2ef823314656ae8cf39b6aab2eeb0dcab082a0" exitCode=0 Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.654287 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49thr" event={"ID":"cc2a000c-f169-4622-8c82-cd4c2baa730a","Type":"ContainerDied","Data":"4da71cf49cfa3d081578fb48fc2ef823314656ae8cf39b6aab2eeb0dcab082a0"} Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.657072 4781 generic.go:334] "Generic (PLEG): container finished" podID="e57ffac0-932b-42fd-bc09-ae357b25eeb1" containerID="b178a99be777daf87be803a4cbee7758b07e3b3c2d83d9e051bfa4db85be0c6d" exitCode=0 Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.657126 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e57ffac0-932b-42fd-bc09-ae357b25eeb1","Type":"ContainerDied","Data":"b178a99be777daf87be803a4cbee7758b07e3b3c2d83d9e051bfa4db85be0c6d"} Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.657158 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e57ffac0-932b-42fd-bc09-ae357b25eeb1","Type":"ContainerDied","Data":"52f4435171fea776734e465dadfa7d220c142ef75d0364376751af62a2757023"} Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.657170 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.657176 4781 scope.go:117] "RemoveContainer" containerID="b178a99be777daf87be803a4cbee7758b07e3b3c2d83d9e051bfa4db85be0c6d" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.661650 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7825d67f-c124-4ee9-9e74-32c35c4370c0","Type":"ContainerDied","Data":"a90b7ced7061699d62e894c9b3b31c21fe93acf06b438953563f0da53923c22d"} Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.661724 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.696619 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.709166 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.711024 4781 scope.go:117] "RemoveContainer" containerID="45fa49853f49c290824390a083c818fc5bca5ced860bd643b7205e56c631d922" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.721671 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.738944 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.750357 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:29:57 crc kubenswrapper[4781]: E0227 00:29:57.751271 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerName="ceilometer-notification-agent" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.751378 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7825d67f-c124-4ee9-9e74-32c35c4370c0" 
containerName="ceilometer-notification-agent" Feb 27 00:29:57 crc kubenswrapper[4781]: E0227 00:29:57.751519 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerName="ceilometer-central-agent" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.751634 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerName="ceilometer-central-agent" Feb 27 00:29:57 crc kubenswrapper[4781]: E0227 00:29:57.751740 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e57ffac0-932b-42fd-bc09-ae357b25eeb1" containerName="nova-api-api" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.751830 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e57ffac0-932b-42fd-bc09-ae357b25eeb1" containerName="nova-api-api" Feb 27 00:29:57 crc kubenswrapper[4781]: E0227 00:29:57.751923 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerName="sg-core" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.752011 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerName="sg-core" Feb 27 00:29:57 crc kubenswrapper[4781]: E0227 00:29:57.752105 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerName="proxy-httpd" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.752203 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerName="proxy-httpd" Feb 27 00:29:57 crc kubenswrapper[4781]: E0227 00:29:57.752315 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e57ffac0-932b-42fd-bc09-ae357b25eeb1" containerName="nova-api-log" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.752405 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e57ffac0-932b-42fd-bc09-ae357b25eeb1" 
containerName="nova-api-log" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.752843 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerName="sg-core" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.752972 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerName="proxy-httpd" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.753072 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerName="ceilometer-central-agent" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.753160 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e57ffac0-932b-42fd-bc09-ae357b25eeb1" containerName="nova-api-api" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.753275 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="7825d67f-c124-4ee9-9e74-32c35c4370c0" containerName="ceilometer-notification-agent" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.753351 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e57ffac0-932b-42fd-bc09-ae357b25eeb1" containerName="nova-api-log" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.756476 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.761429 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.761511 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.763931 4781 scope.go:117] "RemoveContainer" containerID="b178a99be777daf87be803a4cbee7758b07e3b3c2d83d9e051bfa4db85be0c6d" Feb 27 00:29:57 crc kubenswrapper[4781]: E0227 00:29:57.767396 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b178a99be777daf87be803a4cbee7758b07e3b3c2d83d9e051bfa4db85be0c6d\": container with ID starting with b178a99be777daf87be803a4cbee7758b07e3b3c2d83d9e051bfa4db85be0c6d not found: ID does not exist" containerID="b178a99be777daf87be803a4cbee7758b07e3b3c2d83d9e051bfa4db85be0c6d" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.767540 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b178a99be777daf87be803a4cbee7758b07e3b3c2d83d9e051bfa4db85be0c6d"} err="failed to get container status \"b178a99be777daf87be803a4cbee7758b07e3b3c2d83d9e051bfa4db85be0c6d\": rpc error: code = NotFound desc = could not find container \"b178a99be777daf87be803a4cbee7758b07e3b3c2d83d9e051bfa4db85be0c6d\": container with ID starting with b178a99be777daf87be803a4cbee7758b07e3b3c2d83d9e051bfa4db85be0c6d not found: ID does not exist" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.767660 4781 scope.go:117] "RemoveContainer" containerID="45fa49853f49c290824390a083c818fc5bca5ced860bd643b7205e56c631d922" Feb 27 00:29:57 crc kubenswrapper[4781]: E0227 00:29:57.768075 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"45fa49853f49c290824390a083c818fc5bca5ced860bd643b7205e56c631d922\": container with ID starting with 45fa49853f49c290824390a083c818fc5bca5ced860bd643b7205e56c631d922 not found: ID does not exist" containerID="45fa49853f49c290824390a083c818fc5bca5ced860bd643b7205e56c631d922" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.768175 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45fa49853f49c290824390a083c818fc5bca5ced860bd643b7205e56c631d922"} err="failed to get container status \"45fa49853f49c290824390a083c818fc5bca5ced860bd643b7205e56c631d922\": rpc error: code = NotFound desc = could not find container \"45fa49853f49c290824390a083c818fc5bca5ced860bd643b7205e56c631d922\": container with ID starting with 45fa49853f49c290824390a083c818fc5bca5ced860bd643b7205e56c631d922 not found: ID does not exist" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.768267 4781 scope.go:117] "RemoveContainer" containerID="78c52f488afaab989176f5c5ab096fb61a4e74a72dc5d52ce83048b14f67d902" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.766836 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.780843 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.782601 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.787904 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.788216 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.788413 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.796419 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.814044 4781 scope.go:117] "RemoveContainer" containerID="fd9909df11f574e0138a430f34c72bf18ace2c57464e54425e45df0b7fd14f75" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.819320 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-run-httpd\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.819376 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-config-data\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.819399 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 
00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.819420 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-config-data\") pod \"nova-api-0\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.819433 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-log-httpd\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.819450 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ecbcf96-0260-4e87-afe5-9acc6098ec59-logs\") pod \"nova-api-0\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.819753 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-public-tls-certs\") pod \"nova-api-0\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.819824 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmfqp\" (UniqueName: \"kubernetes.io/projected/0ecbcf96-0260-4e87-afe5-9acc6098ec59-kube-api-access-vmfqp\") pod \"nova-api-0\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.819865 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.820041 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5spl\" (UniqueName: \"kubernetes.io/projected/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-kube-api-access-b5spl\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.820169 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-scripts\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.820248 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.820279 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.833617 4781 scope.go:117] "RemoveContainer" containerID="9a609615b1e77c141503575f4b85bd73b7b9605cdd075e949757163fb3230f19" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.854068 4781 scope.go:117] "RemoveContainer" 
containerID="3ddb72adfd8dadbe432eb551c304f261946dae5663273b00c2b5c6ab9ec5b0b1" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.922344 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-public-tls-certs\") pod \"nova-api-0\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.922401 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmfqp\" (UniqueName: \"kubernetes.io/projected/0ecbcf96-0260-4e87-afe5-9acc6098ec59-kube-api-access-vmfqp\") pod \"nova-api-0\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.922432 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.922524 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5spl\" (UniqueName: \"kubernetes.io/projected/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-kube-api-access-b5spl\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.922561 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-scripts\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.922584 4781 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.922605 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.922659 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-run-httpd\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.922688 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-config-data\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.922712 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.922739 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-config-data\") pod \"nova-api-0\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " pod="openstack/nova-api-0" Feb 
27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.922759 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-log-httpd\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.922782 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ecbcf96-0260-4e87-afe5-9acc6098ec59-logs\") pod \"nova-api-0\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.923230 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ecbcf96-0260-4e87-afe5-9acc6098ec59-logs\") pod \"nova-api-0\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.923581 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-run-httpd\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.923896 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-log-httpd\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.928266 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-public-tls-certs\") pod \"nova-api-0\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " 
pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.928852 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.929218 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.929371 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-config-data\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.930243 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.934621 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.934729 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-scripts\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.940515 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-config-data\") pod \"nova-api-0\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " pod="openstack/nova-api-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.940504 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5spl\" (UniqueName: \"kubernetes.io/projected/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-kube-api-access-b5spl\") pod \"ceilometer-0\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " pod="openstack/ceilometer-0" Feb 27 00:29:57 crc kubenswrapper[4781]: I0227 00:29:57.943636 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmfqp\" (UniqueName: \"kubernetes.io/projected/0ecbcf96-0260-4e87-afe5-9acc6098ec59-kube-api-access-vmfqp\") pod \"nova-api-0\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " pod="openstack/nova-api-0" Feb 27 00:29:58 crc kubenswrapper[4781]: I0227 00:29:58.110826 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:29:58 crc kubenswrapper[4781]: I0227 00:29:58.116338 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 27 00:29:58 crc kubenswrapper[4781]: I0227 00:29:58.414054 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 27 00:29:58 crc kubenswrapper[4781]: I0227 00:29:58.414715 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="91997a3e-9e65-4eab-a0b9-8f9c639a8d05" containerName="kube-state-metrics" containerID="cri-o://59ed5bb57f5c002905a336da46ce8019d8424d181ddbd01fd683c6c25bea9d90" gracePeriod=30 Feb 27 00:29:58 crc kubenswrapper[4781]: I0227 00:29:58.619023 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:29:58 crc kubenswrapper[4781]: I0227 00:29:58.679029 4781 generic.go:334] "Generic (PLEG): container finished" podID="91997a3e-9e65-4eab-a0b9-8f9c639a8d05" containerID="59ed5bb57f5c002905a336da46ce8019d8424d181ddbd01fd683c6c25bea9d90" exitCode=2 Feb 27 00:29:58 crc kubenswrapper[4781]: I0227 00:29:58.679105 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"91997a3e-9e65-4eab-a0b9-8f9c639a8d05","Type":"ContainerDied","Data":"59ed5bb57f5c002905a336da46ce8019d8424d181ddbd01fd683c6c25bea9d90"} Feb 27 00:29:58 crc kubenswrapper[4781]: I0227 00:29:58.685162 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93fc1751-3b02-4d9a-8278-caa0f09f8e9e","Type":"ContainerStarted","Data":"2c4686839ed88b8a45c07bbc45e5d7e8f95577bd88f8f5ed02b133c4326a106e"} Feb 27 00:29:58 crc kubenswrapper[4781]: I0227 00:29:58.692951 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49thr" event={"ID":"cc2a000c-f169-4622-8c82-cd4c2baa730a","Type":"ContainerStarted","Data":"d9d953c5a94e7131260a7cfb99d3ce8775a2f8d8e160f41df132bea931da9344"} Feb 27 00:29:58 crc kubenswrapper[4781]: I0227 00:29:58.720278 4781 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/redhat-operators-49thr" podStartSLOduration=3.212644276 podStartE2EDuration="10.720259104s" podCreationTimestamp="2026-02-27 00:29:48 +0000 UTC" firstStartedPulling="2026-02-27 00:29:50.573078578 +0000 UTC m=+1459.830618122" lastFinishedPulling="2026-02-27 00:29:58.080693386 +0000 UTC m=+1467.338232950" observedRunningTime="2026-02-27 00:29:58.709088422 +0000 UTC m=+1467.966627976" watchObservedRunningTime="2026-02-27 00:29:58.720259104 +0000 UTC m=+1467.977798658" Feb 27 00:29:58 crc kubenswrapper[4781]: W0227 00:29:58.733304 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ecbcf96_0260_4e87_afe5_9acc6098ec59.slice/crio-b6418bc15cd13e8d0bd6da20db363f8f00d774a0d59f04ab4104216089ac4cbd WatchSource:0}: Error finding container b6418bc15cd13e8d0bd6da20db363f8f00d774a0d59f04ab4104216089ac4cbd: Status 404 returned error can't find the container with id b6418bc15cd13e8d0bd6da20db363f8f00d774a0d59f04ab4104216089ac4cbd Feb 27 00:29:58 crc kubenswrapper[4781]: I0227 00:29:58.738373 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 27 00:29:58 crc kubenswrapper[4781]: I0227 00:29:58.911448 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 27 00:29:58 crc kubenswrapper[4781]: I0227 00:29:58.941485 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.013954 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-49thr" Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.013995 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-49thr" Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 
00:29:59.030521 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.048679 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr9c5\" (UniqueName: \"kubernetes.io/projected/91997a3e-9e65-4eab-a0b9-8f9c639a8d05-kube-api-access-hr9c5\") pod \"91997a3e-9e65-4eab-a0b9-8f9c639a8d05\" (UID: \"91997a3e-9e65-4eab-a0b9-8f9c639a8d05\") " Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.053957 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91997a3e-9e65-4eab-a0b9-8f9c639a8d05-kube-api-access-hr9c5" (OuterVolumeSpecName: "kube-api-access-hr9c5") pod "91997a3e-9e65-4eab-a0b9-8f9c639a8d05" (UID: "91997a3e-9e65-4eab-a0b9-8f9c639a8d05"). InnerVolumeSpecName "kube-api-access-hr9c5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.153055 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr9c5\" (UniqueName: \"kubernetes.io/projected/91997a3e-9e65-4eab-a0b9-8f9c639a8d05-kube-api-access-hr9c5\") on node \"crc\" DevicePath \"\"" Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.319877 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7825d67f-c124-4ee9-9e74-32c35c4370c0" path="/var/lib/kubelet/pods/7825d67f-c124-4ee9-9e74-32c35c4370c0/volumes" Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.320610 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e57ffac0-932b-42fd-bc09-ae357b25eeb1" path="/var/lib/kubelet/pods/e57ffac0-932b-42fd-bc09-ae357b25eeb1/volumes" Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.736374 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"0ecbcf96-0260-4e87-afe5-9acc6098ec59","Type":"ContainerStarted","Data":"4fb1191f2a33c534b5ff081b75c12582b7d083d0914703fd100fe77a443ffaca"} Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.736931 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0ecbcf96-0260-4e87-afe5-9acc6098ec59","Type":"ContainerStarted","Data":"a1f456672f54264ca1c600a8f933a178f9e17b564318587aad61468738ce8718"} Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.736949 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0ecbcf96-0260-4e87-afe5-9acc6098ec59","Type":"ContainerStarted","Data":"b6418bc15cd13e8d0bd6da20db363f8f00d774a0d59f04ab4104216089ac4cbd"} Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.743027 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"91997a3e-9e65-4eab-a0b9-8f9c639a8d05","Type":"ContainerDied","Data":"40455368adae6ae08a72863a733dea4cab1b575394a61b8c7f6b3e11518f1446"} Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.743080 4781 scope.go:117] "RemoveContainer" containerID="59ed5bb57f5c002905a336da46ce8019d8424d181ddbd01fd683c6c25bea9d90" Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.743231 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.747052 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93fc1751-3b02-4d9a-8278-caa0f09f8e9e","Type":"ContainerStarted","Data":"6aee62b4a0e99b93b70f6483a48731232b6e0bac4ac727deafa8d7b77d1fa9e5"} Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.773918 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.773900059 podStartE2EDuration="2.773900059s" podCreationTimestamp="2026-02-27 00:29:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:29:59.759941952 +0000 UTC m=+1469.017481516" watchObservedRunningTime="2026-02-27 00:29:59.773900059 +0000 UTC m=+1469.031439613" Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.781661 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.910175 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.949425 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.960256 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 27 00:29:59 crc kubenswrapper[4781]: E0227 00:29:59.962671 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91997a3e-9e65-4eab-a0b9-8f9c639a8d05" containerName="kube-state-metrics" Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.962718 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="91997a3e-9e65-4eab-a0b9-8f9c639a8d05" containerName="kube-state-metrics" Feb 27 00:29:59 crc kubenswrapper[4781]: 
I0227 00:29:59.963964 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="91997a3e-9e65-4eab-a0b9-8f9c639a8d05" containerName="kube-state-metrics" Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.968232 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.971014 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.971394 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 27 00:29:59 crc kubenswrapper[4781]: I0227 00:29:59.973099 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.045959 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-6twxl"] Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.048200 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6twxl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.051748 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.058935 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.060182 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-6twxl"] Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.065525 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-49thr" podUID="cc2a000c-f169-4622-8c82-cd4c2baa730a" containerName="registry-server" probeResult="failure" output=< Feb 27 00:30:00 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s Feb 27 00:30:00 crc kubenswrapper[4781]: > Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.083384 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/25933928-b136-4b38-955a-46a3d802a62b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"25933928-b136-4b38-955a-46a3d802a62b\") " pod="openstack/kube-state-metrics-0" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.083500 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bp4c\" (UniqueName: \"kubernetes.io/projected/25933928-b136-4b38-955a-46a3d802a62b-kube-api-access-9bp4c\") pod \"kube-state-metrics-0\" (UID: \"25933928-b136-4b38-955a-46a3d802a62b\") " pod="openstack/kube-state-metrics-0" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.083527 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25933928-b136-4b38-955a-46a3d802a62b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"25933928-b136-4b38-955a-46a3d802a62b\") " pod="openstack/kube-state-metrics-0" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.083648 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/25933928-b136-4b38-955a-46a3d802a62b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"25933928-b136-4b38-955a-46a3d802a62b\") " pod="openstack/kube-state-metrics-0" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.096817 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.153078 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535870-pjv2c"] Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.155028 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535870-pjv2c" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.157339 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.157655 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.162134 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.185146 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27b0d2a5-5629-42a0-8884-a5534240b356-config-data\") pod \"nova-cell1-cell-mapping-6twxl\" (UID: \"27b0d2a5-5629-42a0-8884-a5534240b356\") " pod="openstack/nova-cell1-cell-mapping-6twxl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.185221 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27b0d2a5-5629-42a0-8884-a5534240b356-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6twxl\" (UID: \"27b0d2a5-5629-42a0-8884-a5534240b356\") " pod="openstack/nova-cell1-cell-mapping-6twxl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.185283 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/25933928-b136-4b38-955a-46a3d802a62b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"25933928-b136-4b38-955a-46a3d802a62b\") " pod="openstack/kube-state-metrics-0" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.185312 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/27b0d2a5-5629-42a0-8884-a5534240b356-scripts\") pod \"nova-cell1-cell-mapping-6twxl\" (UID: \"27b0d2a5-5629-42a0-8884-a5534240b356\") " pod="openstack/nova-cell1-cell-mapping-6twxl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.185369 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpxdl\" (UniqueName: \"kubernetes.io/projected/27b0d2a5-5629-42a0-8884-a5534240b356-kube-api-access-xpxdl\") pod \"nova-cell1-cell-mapping-6twxl\" (UID: \"27b0d2a5-5629-42a0-8884-a5534240b356\") " pod="openstack/nova-cell1-cell-mapping-6twxl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.185433 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bp4c\" (UniqueName: \"kubernetes.io/projected/25933928-b136-4b38-955a-46a3d802a62b-kube-api-access-9bp4c\") pod \"kube-state-metrics-0\" (UID: \"25933928-b136-4b38-955a-46a3d802a62b\") " pod="openstack/kube-state-metrics-0" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.185466 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25933928-b136-4b38-955a-46a3d802a62b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"25933928-b136-4b38-955a-46a3d802a62b\") " pod="openstack/kube-state-metrics-0" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.185588 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/25933928-b136-4b38-955a-46a3d802a62b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"25933928-b136-4b38-955a-46a3d802a62b\") " pod="openstack/kube-state-metrics-0" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.194577 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/25933928-b136-4b38-955a-46a3d802a62b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"25933928-b136-4b38-955a-46a3d802a62b\") " pod="openstack/kube-state-metrics-0" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.194763 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl"] Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.196398 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/25933928-b136-4b38-955a-46a3d802a62b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"25933928-b136-4b38-955a-46a3d802a62b\") " pod="openstack/kube-state-metrics-0" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.196672 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.198266 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25933928-b136-4b38-955a-46a3d802a62b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"25933928-b136-4b38-955a-46a3d802a62b\") " pod="openstack/kube-state-metrics-0" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.199873 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.200090 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.218179 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535870-pjv2c"] Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 
00:30:00.220795 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bp4c\" (UniqueName: \"kubernetes.io/projected/25933928-b136-4b38-955a-46a3d802a62b-kube-api-access-9bp4c\") pod \"kube-state-metrics-0\" (UID: \"25933928-b136-4b38-955a-46a3d802a62b\") " pod="openstack/kube-state-metrics-0" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.233976 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl"] Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.251292 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-l4cw7"] Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.251585 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78cd565959-l4cw7" podUID="5f47f2d5-f4d5-448d-9355-ebe37959b584" containerName="dnsmasq-dns" containerID="cri-o://26d79208d95dcfd480e6dcf5e635ea74d70976218b9d0db2771a4aca513d9249" gracePeriod=10 Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.288028 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27b0d2a5-5629-42a0-8884-a5534240b356-config-data\") pod \"nova-cell1-cell-mapping-6twxl\" (UID: \"27b0d2a5-5629-42a0-8884-a5534240b356\") " pod="openstack/nova-cell1-cell-mapping-6twxl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.288244 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27b0d2a5-5629-42a0-8884-a5534240b356-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6twxl\" (UID: \"27b0d2a5-5629-42a0-8884-a5534240b356\") " pod="openstack/nova-cell1-cell-mapping-6twxl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.288459 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/27b0d2a5-5629-42a0-8884-a5534240b356-scripts\") pod \"nova-cell1-cell-mapping-6twxl\" (UID: \"27b0d2a5-5629-42a0-8884-a5534240b356\") " pod="openstack/nova-cell1-cell-mapping-6twxl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.288528 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqmlt\" (UniqueName: \"kubernetes.io/projected/b8ec74af-d604-42ac-83bb-db047e8d8506-kube-api-access-jqmlt\") pod \"auto-csr-approver-29535870-pjv2c\" (UID: \"b8ec74af-d604-42ac-83bb-db047e8d8506\") " pod="openshift-infra/auto-csr-approver-29535870-pjv2c" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.288647 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpxdl\" (UniqueName: \"kubernetes.io/projected/27b0d2a5-5629-42a0-8884-a5534240b356-kube-api-access-xpxdl\") pod \"nova-cell1-cell-mapping-6twxl\" (UID: \"27b0d2a5-5629-42a0-8884-a5534240b356\") " pod="openstack/nova-cell1-cell-mapping-6twxl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.299035 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27b0d2a5-5629-42a0-8884-a5534240b356-scripts\") pod \"nova-cell1-cell-mapping-6twxl\" (UID: \"27b0d2a5-5629-42a0-8884-a5534240b356\") " pod="openstack/nova-cell1-cell-mapping-6twxl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.299237 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27b0d2a5-5629-42a0-8884-a5534240b356-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6twxl\" (UID: \"27b0d2a5-5629-42a0-8884-a5534240b356\") " pod="openstack/nova-cell1-cell-mapping-6twxl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.299549 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.299734 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27b0d2a5-5629-42a0-8884-a5534240b356-config-data\") pod \"nova-cell1-cell-mapping-6twxl\" (UID: \"27b0d2a5-5629-42a0-8884-a5534240b356\") " pod="openstack/nova-cell1-cell-mapping-6twxl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.322580 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpxdl\" (UniqueName: \"kubernetes.io/projected/27b0d2a5-5629-42a0-8884-a5534240b356-kube-api-access-xpxdl\") pod \"nova-cell1-cell-mapping-6twxl\" (UID: \"27b0d2a5-5629-42a0-8884-a5534240b356\") " pod="openstack/nova-cell1-cell-mapping-6twxl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.367785 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6twxl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.390977 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb78ed91-75d4-40d9-9359-da1c3878e145-config-volume\") pod \"collect-profiles-29535870-kzqvl\" (UID: \"eb78ed91-75d4-40d9-9359-da1c3878e145\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.391362 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4rc4\" (UniqueName: \"kubernetes.io/projected/eb78ed91-75d4-40d9-9359-da1c3878e145-kube-api-access-l4rc4\") pod \"collect-profiles-29535870-kzqvl\" (UID: \"eb78ed91-75d4-40d9-9359-da1c3878e145\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl" Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.391399 4781 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eb78ed91-75d4-40d9-9359-da1c3878e145-secret-volume\") pod \"collect-profiles-29535870-kzqvl\" (UID: \"eb78ed91-75d4-40d9-9359-da1c3878e145\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl"
Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.391435 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqmlt\" (UniqueName: \"kubernetes.io/projected/b8ec74af-d604-42ac-83bb-db047e8d8506-kube-api-access-jqmlt\") pod \"auto-csr-approver-29535870-pjv2c\" (UID: \"b8ec74af-d604-42ac-83bb-db047e8d8506\") " pod="openshift-infra/auto-csr-approver-29535870-pjv2c"
Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.415308 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqmlt\" (UniqueName: \"kubernetes.io/projected/b8ec74af-d604-42ac-83bb-db047e8d8506-kube-api-access-jqmlt\") pod \"auto-csr-approver-29535870-pjv2c\" (UID: \"b8ec74af-d604-42ac-83bb-db047e8d8506\") " pod="openshift-infra/auto-csr-approver-29535870-pjv2c"
Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.493342 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535870-pjv2c"
Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.495638 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb78ed91-75d4-40d9-9359-da1c3878e145-config-volume\") pod \"collect-profiles-29535870-kzqvl\" (UID: \"eb78ed91-75d4-40d9-9359-da1c3878e145\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl"
Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.495752 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4rc4\" (UniqueName: \"kubernetes.io/projected/eb78ed91-75d4-40d9-9359-da1c3878e145-kube-api-access-l4rc4\") pod \"collect-profiles-29535870-kzqvl\" (UID: \"eb78ed91-75d4-40d9-9359-da1c3878e145\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl"
Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.495798 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eb78ed91-75d4-40d9-9359-da1c3878e145-secret-volume\") pod \"collect-profiles-29535870-kzqvl\" (UID: \"eb78ed91-75d4-40d9-9359-da1c3878e145\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl"
Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.498531 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb78ed91-75d4-40d9-9359-da1c3878e145-config-volume\") pod \"collect-profiles-29535870-kzqvl\" (UID: \"eb78ed91-75d4-40d9-9359-da1c3878e145\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl"
Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.500573 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eb78ed91-75d4-40d9-9359-da1c3878e145-secret-volume\") pod \"collect-profiles-29535870-kzqvl\" (UID: \"eb78ed91-75d4-40d9-9359-da1c3878e145\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl"
Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.525380 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4rc4\" (UniqueName: \"kubernetes.io/projected/eb78ed91-75d4-40d9-9359-da1c3878e145-kube-api-access-l4rc4\") pod \"collect-profiles-29535870-kzqvl\" (UID: \"eb78ed91-75d4-40d9-9359-da1c3878e145\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl"
Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.643600 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl"
Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.786913 4781 generic.go:334] "Generic (PLEG): container finished" podID="5f47f2d5-f4d5-448d-9355-ebe37959b584" containerID="26d79208d95dcfd480e6dcf5e635ea74d70976218b9d0db2771a4aca513d9249" exitCode=0
Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.787193 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-l4cw7" event={"ID":"5f47f2d5-f4d5-448d-9355-ebe37959b584","Type":"ContainerDied","Data":"26d79208d95dcfd480e6dcf5e635ea74d70976218b9d0db2771a4aca513d9249"}
Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.825818 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93fc1751-3b02-4d9a-8278-caa0f09f8e9e","Type":"ContainerStarted","Data":"7b717e2de43421aa8594d897ad2625a4ab8b03819b1fddaf68c17d412b495f36"}
Feb 27 00:30:00 crc kubenswrapper[4781]: I0227 00:30:00.944075 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-l4cw7"
Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.031558 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-ovsdbserver-nb\") pod \"5f47f2d5-f4d5-448d-9355-ebe37959b584\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") "
Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.031763 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-dns-svc\") pod \"5f47f2d5-f4d5-448d-9355-ebe37959b584\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") "
Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.031790 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-dns-swift-storage-0\") pod \"5f47f2d5-f4d5-448d-9355-ebe37959b584\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") "
Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.031877 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-config\") pod \"5f47f2d5-f4d5-448d-9355-ebe37959b584\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") "
Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.031975 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-ovsdbserver-sb\") pod \"5f47f2d5-f4d5-448d-9355-ebe37959b584\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") "
Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.032027 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9vk6\" (UniqueName: \"kubernetes.io/projected/5f47f2d5-f4d5-448d-9355-ebe37959b584-kube-api-access-f9vk6\") pod \"5f47f2d5-f4d5-448d-9355-ebe37959b584\" (UID: \"5f47f2d5-f4d5-448d-9355-ebe37959b584\") "
Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.049297 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f47f2d5-f4d5-448d-9355-ebe37959b584-kube-api-access-f9vk6" (OuterVolumeSpecName: "kube-api-access-f9vk6") pod "5f47f2d5-f4d5-448d-9355-ebe37959b584" (UID: "5f47f2d5-f4d5-448d-9355-ebe37959b584"). InnerVolumeSpecName "kube-api-access-f9vk6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.128868 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5f47f2d5-f4d5-448d-9355-ebe37959b584" (UID: "5f47f2d5-f4d5-448d-9355-ebe37959b584"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.129412 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5f47f2d5-f4d5-448d-9355-ebe37959b584" (UID: "5f47f2d5-f4d5-448d-9355-ebe37959b584"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.135571 4781 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.135606 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.135618 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9vk6\" (UniqueName: \"kubernetes.io/projected/5f47f2d5-f4d5-448d-9355-ebe37959b584-kube-api-access-f9vk6\") on node \"crc\" DevicePath \"\""
Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.140170 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5f47f2d5-f4d5-448d-9355-ebe37959b584" (UID: "5f47f2d5-f4d5-448d-9355-ebe37959b584"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.141563 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5f47f2d5-f4d5-448d-9355-ebe37959b584" (UID: "5f47f2d5-f4d5-448d-9355-ebe37959b584"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.176453 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-config" (OuterVolumeSpecName: "config") pod "5f47f2d5-f4d5-448d-9355-ebe37959b584" (UID: "5f47f2d5-f4d5-448d-9355-ebe37959b584"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.200319 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.237546 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.237584 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.237606 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f47f2d5-f4d5-448d-9355-ebe37959b584-config\") on node \"crc\" DevicePath \"\""
Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.404855 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91997a3e-9e65-4eab-a0b9-8f9c639a8d05" path="/var/lib/kubelet/pods/91997a3e-9e65-4eab-a0b9-8f9c639a8d05/volumes"
Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.437761 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-6twxl"]
Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.449719 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.468698 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535870-pjv2c"]
Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.724913 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl"]
Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.836009 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6twxl" event={"ID":"27b0d2a5-5629-42a0-8884-a5534240b356","Type":"ContainerStarted","Data":"603be41f44dabcefd367f03b819f0e12526431539cc454d1e0a0fbbe4c354d4e"}
Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.836051 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6twxl" event={"ID":"27b0d2a5-5629-42a0-8884-a5534240b356","Type":"ContainerStarted","Data":"2e4718337e97959ef32ed9d78c1825b06db8a7a61e70b8f1c8473596ad38ebed"}
Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.841838 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93fc1751-3b02-4d9a-8278-caa0f09f8e9e","Type":"ContainerStarted","Data":"fe5f026af9d91c7231233bd71a19d73aa9a872a6aadf224c6825ac237ba54fe7"}
Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.850976 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-l4cw7" event={"ID":"5f47f2d5-f4d5-448d-9355-ebe37959b584","Type":"ContainerDied","Data":"e0e61b6d097a768cedf938a2051e02fe6b26d59774f1dfea50ad4f92d0779d0a"}
Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.851031 4781 scope.go:117] "RemoveContainer" containerID="26d79208d95dcfd480e6dcf5e635ea74d70976218b9d0db2771a4aca513d9249"
Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.851186 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-l4cw7"
Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.854681 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-6twxl" podStartSLOduration=1.854669254 podStartE2EDuration="1.854669254s" podCreationTimestamp="2026-02-27 00:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:30:01.853594105 +0000 UTC m=+1471.111133649" watchObservedRunningTime="2026-02-27 00:30:01.854669254 +0000 UTC m=+1471.112208808"
Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.856121 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535870-pjv2c" event={"ID":"b8ec74af-d604-42ac-83bb-db047e8d8506","Type":"ContainerStarted","Data":"ac4590128cbde68a4a47c1669a269597aaf13fb7275d8a433b383eae02651ba0"}
Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.859645 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"25933928-b136-4b38-955a-46a3d802a62b","Type":"ContainerStarted","Data":"7b47bd884aeaea9a9ce08ffea52c5f963733140e3aa935098889fe9feffbc5ef"}
Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.863042 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl" event={"ID":"eb78ed91-75d4-40d9-9359-da1c3878e145","Type":"ContainerStarted","Data":"5dd01220d04e81eb4ff121788a86bb3729ff13c080727f1bce4d0bfcd72babda"}
Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.893570 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-l4cw7"]
Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.928043 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-l4cw7"]
Feb 27 00:30:01 crc kubenswrapper[4781]: I0227 00:30:01.933190 4781 scope.go:117] "RemoveContainer" containerID="363437972dc1edd0a85fa61204497c017a7b8e034221df5e68a301f8138ef7f7"
Feb 27 00:30:02 crc kubenswrapper[4781]: I0227 00:30:02.874213 4781 generic.go:334] "Generic (PLEG): container finished" podID="eb78ed91-75d4-40d9-9359-da1c3878e145" containerID="d91a97b2a127dcb363e0a68bf8507e044d643d2c3b09f879675dfcd44d75afab" exitCode=0
Feb 27 00:30:02 crc kubenswrapper[4781]: I0227 00:30:02.874314 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl" event={"ID":"eb78ed91-75d4-40d9-9359-da1c3878e145","Type":"ContainerDied","Data":"d91a97b2a127dcb363e0a68bf8507e044d643d2c3b09f879675dfcd44d75afab"}
Feb 27 00:30:02 crc kubenswrapper[4781]: I0227 00:30:02.878082 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"25933928-b136-4b38-955a-46a3d802a62b","Type":"ContainerStarted","Data":"1dc221fdcf5400feadbccc6ac0f50f82af7a44c7d54118a608923e78070e2715"}
Feb 27 00:30:02 crc kubenswrapper[4781]: I0227 00:30:02.878302 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Feb 27 00:30:02 crc kubenswrapper[4781]: I0227 00:30:02.913480 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.37756933 podStartE2EDuration="3.913460488s" podCreationTimestamp="2026-02-27 00:29:59 +0000 UTC" firstStartedPulling="2026-02-27 00:30:01.395433647 +0000 UTC m=+1470.652973201" lastFinishedPulling="2026-02-27 00:30:01.931324805 +0000 UTC m=+1471.188864359" observedRunningTime="2026-02-27 00:30:02.903852498 +0000 UTC m=+1472.161392082" watchObservedRunningTime="2026-02-27 00:30:02.913460488 +0000 UTC m=+1472.171000042"
Feb 27 00:30:03 crc kubenswrapper[4781]: I0227 00:30:03.344614 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f47f2d5-f4d5-448d-9355-ebe37959b584" path="/var/lib/kubelet/pods/5f47f2d5-f4d5-448d-9355-ebe37959b584/volumes"
Feb 27 00:30:03 crc kubenswrapper[4781]: I0227 00:30:03.897924 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93fc1751-3b02-4d9a-8278-caa0f09f8e9e","Type":"ContainerStarted","Data":"12384cd6d81bca6e23791de7f51204e92fef8abc9d7b3fd6d25b23d6563fe9c1"}
Feb 27 00:30:03 crc kubenswrapper[4781]: I0227 00:30:03.897919 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerName="ceilometer-central-agent" containerID="cri-o://6aee62b4a0e99b93b70f6483a48731232b6e0bac4ac727deafa8d7b77d1fa9e5" gracePeriod=30
Feb 27 00:30:03 crc kubenswrapper[4781]: I0227 00:30:03.897966 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerName="proxy-httpd" containerID="cri-o://12384cd6d81bca6e23791de7f51204e92fef8abc9d7b3fd6d25b23d6563fe9c1" gracePeriod=30
Feb 27 00:30:03 crc kubenswrapper[4781]: I0227 00:30:03.897976 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerName="sg-core" containerID="cri-o://fe5f026af9d91c7231233bd71a19d73aa9a872a6aadf224c6825ac237ba54fe7" gracePeriod=30
Feb 27 00:30:03 crc kubenswrapper[4781]: I0227 00:30:03.898430 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 27 00:30:03 crc kubenswrapper[4781]: I0227 00:30:03.897986 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerName="ceilometer-notification-agent" containerID="cri-o://7b717e2de43421aa8594d897ad2625a4ab8b03819b1fddaf68c17d412b495f36" gracePeriod=30
Feb 27 00:30:03 crc kubenswrapper[4781]: I0227 00:30:03.934045 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.335436315 podStartE2EDuration="6.93402468s" podCreationTimestamp="2026-02-27 00:29:57 +0000 UTC" firstStartedPulling="2026-02-27 00:29:58.627340134 +0000 UTC m=+1467.884879688" lastFinishedPulling="2026-02-27 00:30:03.225928499 +0000 UTC m=+1472.483468053" observedRunningTime="2026-02-27 00:30:03.927783811 +0000 UTC m=+1473.185323365" watchObservedRunningTime="2026-02-27 00:30:03.93402468 +0000 UTC m=+1473.191564234"
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.496681 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl"
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.584374 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eb78ed91-75d4-40d9-9359-da1c3878e145-secret-volume\") pod \"eb78ed91-75d4-40d9-9359-da1c3878e145\" (UID: \"eb78ed91-75d4-40d9-9359-da1c3878e145\") "
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.584707 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4rc4\" (UniqueName: \"kubernetes.io/projected/eb78ed91-75d4-40d9-9359-da1c3878e145-kube-api-access-l4rc4\") pod \"eb78ed91-75d4-40d9-9359-da1c3878e145\" (UID: \"eb78ed91-75d4-40d9-9359-da1c3878e145\") "
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.584772 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb78ed91-75d4-40d9-9359-da1c3878e145-config-volume\") pod \"eb78ed91-75d4-40d9-9359-da1c3878e145\" (UID: \"eb78ed91-75d4-40d9-9359-da1c3878e145\") "
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.585426 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb78ed91-75d4-40d9-9359-da1c3878e145-config-volume" (OuterVolumeSpecName: "config-volume") pod "eb78ed91-75d4-40d9-9359-da1c3878e145" (UID: "eb78ed91-75d4-40d9-9359-da1c3878e145"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.590712 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb78ed91-75d4-40d9-9359-da1c3878e145-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "eb78ed91-75d4-40d9-9359-da1c3878e145" (UID: "eb78ed91-75d4-40d9-9359-da1c3878e145"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.606903 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb78ed91-75d4-40d9-9359-da1c3878e145-kube-api-access-l4rc4" (OuterVolumeSpecName: "kube-api-access-l4rc4") pod "eb78ed91-75d4-40d9-9359-da1c3878e145" (UID: "eb78ed91-75d4-40d9-9359-da1c3878e145"). InnerVolumeSpecName "kube-api-access-l4rc4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.687025 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4rc4\" (UniqueName: \"kubernetes.io/projected/eb78ed91-75d4-40d9-9359-da1c3878e145-kube-api-access-l4rc4\") on node \"crc\" DevicePath \"\""
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.687230 4781 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb78ed91-75d4-40d9-9359-da1c3878e145-config-volume\") on node \"crc\" DevicePath \"\""
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.687328 4781 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eb78ed91-75d4-40d9-9359-da1c3878e145-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.908981 4781 generic.go:334] "Generic (PLEG): container finished" podID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerID="fe5f026af9d91c7231233bd71a19d73aa9a872a6aadf224c6825ac237ba54fe7" exitCode=2
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.909317 4781 generic.go:334] "Generic (PLEG): container finished" podID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerID="7b717e2de43421aa8594d897ad2625a4ab8b03819b1fddaf68c17d412b495f36" exitCode=0
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.909050 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93fc1751-3b02-4d9a-8278-caa0f09f8e9e","Type":"ContainerDied","Data":"fe5f026af9d91c7231233bd71a19d73aa9a872a6aadf224c6825ac237ba54fe7"}
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.909379 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93fc1751-3b02-4d9a-8278-caa0f09f8e9e","Type":"ContainerDied","Data":"7b717e2de43421aa8594d897ad2625a4ab8b03819b1fddaf68c17d412b495f36"}
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.910990 4781 generic.go:334] "Generic (PLEG): container finished" podID="b8ec74af-d604-42ac-83bb-db047e8d8506" containerID="2520db6bdce6e0291f097369119b25f716226e74f321fc28345a81a9140017c8" exitCode=0
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.911065 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535870-pjv2c" event={"ID":"b8ec74af-d604-42ac-83bb-db047e8d8506","Type":"ContainerDied","Data":"2520db6bdce6e0291f097369119b25f716226e74f321fc28345a81a9140017c8"}
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.912408 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl" event={"ID":"eb78ed91-75d4-40d9-9359-da1c3878e145","Type":"ContainerDied","Data":"5dd01220d04e81eb4ff121788a86bb3729ff13c080727f1bce4d0bfcd72babda"}
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.912430 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5dd01220d04e81eb4ff121788a86bb3729ff13c080727f1bce4d0bfcd72babda"
Feb 27 00:30:04 crc kubenswrapper[4781]: I0227 00:30:04.912484 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl"
Feb 27 00:30:05 crc kubenswrapper[4781]: I0227 00:30:05.618066 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-78cd565959-l4cw7" podUID="5f47f2d5-f4d5-448d-9355-ebe37959b584" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.221:5353: i/o timeout"
Feb 27 00:30:06 crc kubenswrapper[4781]: I0227 00:30:06.391363 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535870-pjv2c"
Feb 27 00:30:06 crc kubenswrapper[4781]: I0227 00:30:06.425231 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqmlt\" (UniqueName: \"kubernetes.io/projected/b8ec74af-d604-42ac-83bb-db047e8d8506-kube-api-access-jqmlt\") pod \"b8ec74af-d604-42ac-83bb-db047e8d8506\" (UID: \"b8ec74af-d604-42ac-83bb-db047e8d8506\") "
Feb 27 00:30:06 crc kubenswrapper[4781]: I0227 00:30:06.434408 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8ec74af-d604-42ac-83bb-db047e8d8506-kube-api-access-jqmlt" (OuterVolumeSpecName: "kube-api-access-jqmlt") pod "b8ec74af-d604-42ac-83bb-db047e8d8506" (UID: "b8ec74af-d604-42ac-83bb-db047e8d8506"). InnerVolumeSpecName "kube-api-access-jqmlt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:30:06 crc kubenswrapper[4781]: I0227 00:30:06.530558 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqmlt\" (UniqueName: \"kubernetes.io/projected/b8ec74af-d604-42ac-83bb-db047e8d8506-kube-api-access-jqmlt\") on node \"crc\" DevicePath \"\""
Feb 27 00:30:06 crc kubenswrapper[4781]: I0227 00:30:06.933038 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535870-pjv2c" event={"ID":"b8ec74af-d604-42ac-83bb-db047e8d8506","Type":"ContainerDied","Data":"ac4590128cbde68a4a47c1669a269597aaf13fb7275d8a433b383eae02651ba0"}
Feb 27 00:30:06 crc kubenswrapper[4781]: I0227 00:30:06.933452 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac4590128cbde68a4a47c1669a269597aaf13fb7275d8a433b383eae02651ba0"
Feb 27 00:30:06 crc kubenswrapper[4781]: I0227 00:30:06.933122 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535870-pjv2c"
Feb 27 00:30:07 crc kubenswrapper[4781]: I0227 00:30:07.480586 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535864-cfd4d"]
Feb 27 00:30:07 crc kubenswrapper[4781]: I0227 00:30:07.495786 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535864-cfd4d"]
Feb 27 00:30:07 crc kubenswrapper[4781]: I0227 00:30:07.943974 4781 generic.go:334] "Generic (PLEG): container finished" podID="27b0d2a5-5629-42a0-8884-a5534240b356" containerID="603be41f44dabcefd367f03b819f0e12526431539cc454d1e0a0fbbe4c354d4e" exitCode=0
Feb 27 00:30:07 crc kubenswrapper[4781]: I0227 00:30:07.944069 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6twxl" event={"ID":"27b0d2a5-5629-42a0-8884-a5534240b356","Type":"ContainerDied","Data":"603be41f44dabcefd367f03b819f0e12526431539cc454d1e0a0fbbe4c354d4e"}
Feb 27 00:30:07 crc kubenswrapper[4781]: I0227 00:30:07.946949 4781 generic.go:334] "Generic (PLEG): container finished" podID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerID="6aee62b4a0e99b93b70f6483a48731232b6e0bac4ac727deafa8d7b77d1fa9e5" exitCode=0
Feb 27 00:30:07 crc kubenswrapper[4781]: I0227 00:30:07.946991 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93fc1751-3b02-4d9a-8278-caa0f09f8e9e","Type":"ContainerDied","Data":"6aee62b4a0e99b93b70f6483a48731232b6e0bac4ac727deafa8d7b77d1fa9e5"}
Feb 27 00:30:08 crc kubenswrapper[4781]: I0227 00:30:08.116677 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 27 00:30:08 crc kubenswrapper[4781]: I0227 00:30:08.116726 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.133812 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0ecbcf96-0260-4e87-afe5-9acc6098ec59" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.233:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.133825 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0ecbcf96-0260-4e87-afe5-9acc6098ec59" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.233:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.326531 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9402a6e-66bb-4e1e-a33f-7fce411c83b8" path="/var/lib/kubelet/pods/b9402a6e-66bb-4e1e-a33f-7fce411c83b8/volumes"
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.408288 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6twxl"
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.485812 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27b0d2a5-5629-42a0-8884-a5534240b356-combined-ca-bundle\") pod \"27b0d2a5-5629-42a0-8884-a5534240b356\" (UID: \"27b0d2a5-5629-42a0-8884-a5534240b356\") "
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.485977 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27b0d2a5-5629-42a0-8884-a5534240b356-scripts\") pod \"27b0d2a5-5629-42a0-8884-a5534240b356\" (UID: \"27b0d2a5-5629-42a0-8884-a5534240b356\") "
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.486015 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpxdl\" (UniqueName: \"kubernetes.io/projected/27b0d2a5-5629-42a0-8884-a5534240b356-kube-api-access-xpxdl\") pod \"27b0d2a5-5629-42a0-8884-a5534240b356\" (UID: \"27b0d2a5-5629-42a0-8884-a5534240b356\") "
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.486049 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27b0d2a5-5629-42a0-8884-a5534240b356-config-data\") pod \"27b0d2a5-5629-42a0-8884-a5534240b356\" (UID: \"27b0d2a5-5629-42a0-8884-a5534240b356\") "
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.503788 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27b0d2a5-5629-42a0-8884-a5534240b356-scripts" (OuterVolumeSpecName: "scripts") pod "27b0d2a5-5629-42a0-8884-a5534240b356" (UID: "27b0d2a5-5629-42a0-8884-a5534240b356"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.503844 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27b0d2a5-5629-42a0-8884-a5534240b356-kube-api-access-xpxdl" (OuterVolumeSpecName: "kube-api-access-xpxdl") pod "27b0d2a5-5629-42a0-8884-a5534240b356" (UID: "27b0d2a5-5629-42a0-8884-a5534240b356"). InnerVolumeSpecName "kube-api-access-xpxdl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.519905 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27b0d2a5-5629-42a0-8884-a5534240b356-config-data" (OuterVolumeSpecName: "config-data") pod "27b0d2a5-5629-42a0-8884-a5534240b356" (UID: "27b0d2a5-5629-42a0-8884-a5534240b356"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.519996 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27b0d2a5-5629-42a0-8884-a5534240b356-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27b0d2a5-5629-42a0-8884-a5534240b356" (UID: "27b0d2a5-5629-42a0-8884-a5534240b356"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.588868 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27b0d2a5-5629-42a0-8884-a5534240b356-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.588906 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27b0d2a5-5629-42a0-8884-a5534240b356-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.588918 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpxdl\" (UniqueName: \"kubernetes.io/projected/27b0d2a5-5629-42a0-8884-a5534240b356-kube-api-access-xpxdl\") on node \"crc\" DevicePath \"\""
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.588931 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27b0d2a5-5629-42a0-8884-a5534240b356-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.972011 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6twxl" event={"ID":"27b0d2a5-5629-42a0-8884-a5534240b356","Type":"ContainerDied","Data":"2e4718337e97959ef32ed9d78c1825b06db8a7a61e70b8f1c8473596ad38ebed"}
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.972054 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e4718337e97959ef32ed9d78c1825b06db8a7a61e70b8f1c8473596ad38ebed"
Feb 27 00:30:09 crc kubenswrapper[4781]: I0227 00:30:09.972070 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6twxl"
Feb 27 00:30:10 crc kubenswrapper[4781]: I0227 00:30:10.062412 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-49thr" podUID="cc2a000c-f169-4622-8c82-cd4c2baa730a" containerName="registry-server" probeResult="failure" output=<
Feb 27 00:30:10 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s
Feb 27 00:30:10 crc kubenswrapper[4781]: >
Feb 27 00:30:10 crc kubenswrapper[4781]: I0227 00:30:10.163530 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 27 00:30:10 crc kubenswrapper[4781]: I0227 00:30:10.163767 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0ecbcf96-0260-4e87-afe5-9acc6098ec59" containerName="nova-api-log" containerID="cri-o://a1f456672f54264ca1c600a8f933a178f9e17b564318587aad61468738ce8718" gracePeriod=30
Feb 27 00:30:10 crc kubenswrapper[4781]: I0227 00:30:10.163883 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0ecbcf96-0260-4e87-afe5-9acc6098ec59" containerName="nova-api-api" containerID="cri-o://4fb1191f2a33c534b5ff081b75c12582b7d083d0914703fd100fe77a443ffaca" gracePeriod=30
Feb 27 00:30:10 crc kubenswrapper[4781]: I0227 00:30:10.185382 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 27 00:30:10 crc kubenswrapper[4781]: I0227 00:30:10.185715 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f5e01f6b-d306-41ac-9988-156063c5af7d" containerName="nova-scheduler-scheduler" containerID="cri-o://8a434297e1d497ddfa074c1233744a9c79e7a3482bb8e37e36657a3849467eab" gracePeriod=30
Feb 27 00:30:10 crc kubenswrapper[4781]: I0227 00:30:10.220939 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 27 00:30:10 crc kubenswrapper[4781]: I0227 00:30:10.221795 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7524846b-772f-47a1-aaae-e7f29db2c0b5" containerName="nova-metadata-metadata" containerID="cri-o://8b0334950030ff6d04fa6ce0a9d86218c27e54eace8f56b035169aad1a3acccf" gracePeriod=30
Feb 27 00:30:10 crc kubenswrapper[4781]: I0227 00:30:10.225805 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7524846b-772f-47a1-aaae-e7f29db2c0b5" containerName="nova-metadata-log" containerID="cri-o://a1babffa87bc8c759ec31af189e69efb911ccecd50fe63e3bf9d7c05e890f1a9" gracePeriod=30
Feb 27 00:30:10 crc kubenswrapper[4781]: I0227 00:30:10.332468 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Feb 27 00:30:10 crc kubenswrapper[4781]: E0227 00:30:10.982668 4781 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8a434297e1d497ddfa074c1233744a9c79e7a3482bb8e37e36657a3849467eab" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 27 00:30:10 crc kubenswrapper[4781]: E0227 00:30:10.988715 4781 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8a434297e1d497ddfa074c1233744a9c79e7a3482bb8e37e36657a3849467eab" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 27 00:30:10 crc kubenswrapper[4781]: I0227 00:30:10.992147 4781
generic.go:334] "Generic (PLEG): container finished" podID="7524846b-772f-47a1-aaae-e7f29db2c0b5" containerID="a1babffa87bc8c759ec31af189e69efb911ccecd50fe63e3bf9d7c05e890f1a9" exitCode=143 Feb 27 00:30:10 crc kubenswrapper[4781]: E0227 00:30:10.992160 4781 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8a434297e1d497ddfa074c1233744a9c79e7a3482bb8e37e36657a3849467eab" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 27 00:30:10 crc kubenswrapper[4781]: E0227 00:30:10.992217 4781 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="f5e01f6b-d306-41ac-9988-156063c5af7d" containerName="nova-scheduler-scheduler" Feb 27 00:30:10 crc kubenswrapper[4781]: I0227 00:30:10.992227 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7524846b-772f-47a1-aaae-e7f29db2c0b5","Type":"ContainerDied","Data":"a1babffa87bc8c759ec31af189e69efb911ccecd50fe63e3bf9d7c05e890f1a9"} Feb 27 00:30:10 crc kubenswrapper[4781]: I0227 00:30:10.996818 4781 generic.go:334] "Generic (PLEG): container finished" podID="0ecbcf96-0260-4e87-afe5-9acc6098ec59" containerID="a1f456672f54264ca1c600a8f933a178f9e17b564318587aad61468738ce8718" exitCode=143 Feb 27 00:30:10 crc kubenswrapper[4781]: I0227 00:30:10.996861 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0ecbcf96-0260-4e87-afe5-9acc6098ec59","Type":"ContainerDied","Data":"a1f456672f54264ca1c600a8f933a178f9e17b564318587aad61468738ce8718"} Feb 27 00:30:12 crc kubenswrapper[4781]: I0227 00:30:12.895471 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:30:12 crc kubenswrapper[4781]: I0227 00:30:12.895875 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:30:13 crc kubenswrapper[4781]: I0227 00:30:13.378869 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="7524846b-772f-47a1-aaae-e7f29db2c0b5" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.225:8775/\": read tcp 10.217.0.2:48616->10.217.0.225:8775: read: connection reset by peer" Feb 27 00:30:13 crc kubenswrapper[4781]: I0227 00:30:13.378958 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="7524846b-772f-47a1-aaae-e7f29db2c0b5" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.225:8775/\": read tcp 10.217.0.2:48614->10.217.0.225:8775: read: connection reset by peer" Feb 27 00:30:13 crc kubenswrapper[4781]: I0227 00:30:13.906137 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 00:30:13 crc kubenswrapper[4781]: I0227 00:30:13.978777 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7524846b-772f-47a1-aaae-e7f29db2c0b5-combined-ca-bundle\") pod \"7524846b-772f-47a1-aaae-e7f29db2c0b5\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") " Feb 27 00:30:13 crc kubenswrapper[4781]: I0227 00:30:13.978903 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7524846b-772f-47a1-aaae-e7f29db2c0b5-nova-metadata-tls-certs\") pod \"7524846b-772f-47a1-aaae-e7f29db2c0b5\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") " Feb 27 00:30:13 crc kubenswrapper[4781]: I0227 00:30:13.978966 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7524846b-772f-47a1-aaae-e7f29db2c0b5-config-data\") pod \"7524846b-772f-47a1-aaae-e7f29db2c0b5\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") " Feb 27 00:30:13 crc kubenswrapper[4781]: I0227 00:30:13.979043 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94rnf\" (UniqueName: \"kubernetes.io/projected/7524846b-772f-47a1-aaae-e7f29db2c0b5-kube-api-access-94rnf\") pod \"7524846b-772f-47a1-aaae-e7f29db2c0b5\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") " Feb 27 00:30:13 crc kubenswrapper[4781]: I0227 00:30:13.979182 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7524846b-772f-47a1-aaae-e7f29db2c0b5-logs\") pod \"7524846b-772f-47a1-aaae-e7f29db2c0b5\" (UID: \"7524846b-772f-47a1-aaae-e7f29db2c0b5\") " Feb 27 00:30:13 crc kubenswrapper[4781]: I0227 00:30:13.980263 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/7524846b-772f-47a1-aaae-e7f29db2c0b5-logs" (OuterVolumeSpecName: "logs") pod "7524846b-772f-47a1-aaae-e7f29db2c0b5" (UID: "7524846b-772f-47a1-aaae-e7f29db2c0b5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:30:13 crc kubenswrapper[4781]: I0227 00:30:13.988849 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7524846b-772f-47a1-aaae-e7f29db2c0b5-kube-api-access-94rnf" (OuterVolumeSpecName: "kube-api-access-94rnf") pod "7524846b-772f-47a1-aaae-e7f29db2c0b5" (UID: "7524846b-772f-47a1-aaae-e7f29db2c0b5"). InnerVolumeSpecName "kube-api-access-94rnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.014543 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7524846b-772f-47a1-aaae-e7f29db2c0b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7524846b-772f-47a1-aaae-e7f29db2c0b5" (UID: "7524846b-772f-47a1-aaae-e7f29db2c0b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.019389 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7524846b-772f-47a1-aaae-e7f29db2c0b5-config-data" (OuterVolumeSpecName: "config-data") pod "7524846b-772f-47a1-aaae-e7f29db2c0b5" (UID: "7524846b-772f-47a1-aaae-e7f29db2c0b5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.037689 4781 generic.go:334] "Generic (PLEG): container finished" podID="7524846b-772f-47a1-aaae-e7f29db2c0b5" containerID="8b0334950030ff6d04fa6ce0a9d86218c27e54eace8f56b035169aad1a3acccf" exitCode=0 Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.037728 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7524846b-772f-47a1-aaae-e7f29db2c0b5","Type":"ContainerDied","Data":"8b0334950030ff6d04fa6ce0a9d86218c27e54eace8f56b035169aad1a3acccf"} Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.037754 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7524846b-772f-47a1-aaae-e7f29db2c0b5","Type":"ContainerDied","Data":"38c766ee39e3c7f63ba0e025dfac0a7e85784b006de99cf2ffb71128d62e3b91"} Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.037773 4781 scope.go:117] "RemoveContainer" containerID="8b0334950030ff6d04fa6ce0a9d86218c27e54eace8f56b035169aad1a3acccf" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.037912 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.044499 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7524846b-772f-47a1-aaae-e7f29db2c0b5-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "7524846b-772f-47a1-aaae-e7f29db2c0b5" (UID: "7524846b-772f-47a1-aaae-e7f29db2c0b5"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.116811 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7524846b-772f-47a1-aaae-e7f29db2c0b5-logs\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.116845 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7524846b-772f-47a1-aaae-e7f29db2c0b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.116855 4781 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7524846b-772f-47a1-aaae-e7f29db2c0b5-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.116864 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7524846b-772f-47a1-aaae-e7f29db2c0b5-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.116872 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94rnf\" (UniqueName: \"kubernetes.io/projected/7524846b-772f-47a1-aaae-e7f29db2c0b5-kube-api-access-94rnf\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.139973 4781 scope.go:117] "RemoveContainer" containerID="a1babffa87bc8c759ec31af189e69efb911ccecd50fe63e3bf9d7c05e890f1a9" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.172666 4781 scope.go:117] "RemoveContainer" containerID="8b0334950030ff6d04fa6ce0a9d86218c27e54eace8f56b035169aad1a3acccf" Feb 27 00:30:14 crc kubenswrapper[4781]: E0227 00:30:14.173091 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8b0334950030ff6d04fa6ce0a9d86218c27e54eace8f56b035169aad1a3acccf\": container with ID starting with 8b0334950030ff6d04fa6ce0a9d86218c27e54eace8f56b035169aad1a3acccf not found: ID does not exist" containerID="8b0334950030ff6d04fa6ce0a9d86218c27e54eace8f56b035169aad1a3acccf" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.173120 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b0334950030ff6d04fa6ce0a9d86218c27e54eace8f56b035169aad1a3acccf"} err="failed to get container status \"8b0334950030ff6d04fa6ce0a9d86218c27e54eace8f56b035169aad1a3acccf\": rpc error: code = NotFound desc = could not find container \"8b0334950030ff6d04fa6ce0a9d86218c27e54eace8f56b035169aad1a3acccf\": container with ID starting with 8b0334950030ff6d04fa6ce0a9d86218c27e54eace8f56b035169aad1a3acccf not found: ID does not exist" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.173142 4781 scope.go:117] "RemoveContainer" containerID="a1babffa87bc8c759ec31af189e69efb911ccecd50fe63e3bf9d7c05e890f1a9" Feb 27 00:30:14 crc kubenswrapper[4781]: E0227 00:30:14.173444 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1babffa87bc8c759ec31af189e69efb911ccecd50fe63e3bf9d7c05e890f1a9\": container with ID starting with a1babffa87bc8c759ec31af189e69efb911ccecd50fe63e3bf9d7c05e890f1a9 not found: ID does not exist" containerID="a1babffa87bc8c759ec31af189e69efb911ccecd50fe63e3bf9d7c05e890f1a9" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.173462 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1babffa87bc8c759ec31af189e69efb911ccecd50fe63e3bf9d7c05e890f1a9"} err="failed to get container status \"a1babffa87bc8c759ec31af189e69efb911ccecd50fe63e3bf9d7c05e890f1a9\": rpc error: code = NotFound desc = could not find container \"a1babffa87bc8c759ec31af189e69efb911ccecd50fe63e3bf9d7c05e890f1a9\": container with ID 
starting with a1babffa87bc8c759ec31af189e69efb911ccecd50fe63e3bf9d7c05e890f1a9 not found: ID does not exist" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.369642 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.380279 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.393024 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 27 00:30:14 crc kubenswrapper[4781]: E0227 00:30:14.393756 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f47f2d5-f4d5-448d-9355-ebe37959b584" containerName="init" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.393846 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f47f2d5-f4d5-448d-9355-ebe37959b584" containerName="init" Feb 27 00:30:14 crc kubenswrapper[4781]: E0227 00:30:14.393938 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7524846b-772f-47a1-aaae-e7f29db2c0b5" containerName="nova-metadata-metadata" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.394044 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7524846b-772f-47a1-aaae-e7f29db2c0b5" containerName="nova-metadata-metadata" Feb 27 00:30:14 crc kubenswrapper[4781]: E0227 00:30:14.394153 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27b0d2a5-5629-42a0-8884-a5534240b356" containerName="nova-manage" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.394238 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="27b0d2a5-5629-42a0-8884-a5534240b356" containerName="nova-manage" Feb 27 00:30:14 crc kubenswrapper[4781]: E0227 00:30:14.394312 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb78ed91-75d4-40d9-9359-da1c3878e145" containerName="collect-profiles" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 
00:30:14.394377 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb78ed91-75d4-40d9-9359-da1c3878e145" containerName="collect-profiles" Feb 27 00:30:14 crc kubenswrapper[4781]: E0227 00:30:14.394459 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7524846b-772f-47a1-aaae-e7f29db2c0b5" containerName="nova-metadata-log" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.394549 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7524846b-772f-47a1-aaae-e7f29db2c0b5" containerName="nova-metadata-log" Feb 27 00:30:14 crc kubenswrapper[4781]: E0227 00:30:14.394666 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f47f2d5-f4d5-448d-9355-ebe37959b584" containerName="dnsmasq-dns" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.394748 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f47f2d5-f4d5-448d-9355-ebe37959b584" containerName="dnsmasq-dns" Feb 27 00:30:14 crc kubenswrapper[4781]: E0227 00:30:14.394822 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ec74af-d604-42ac-83bb-db047e8d8506" containerName="oc" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.394884 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ec74af-d604-42ac-83bb-db047e8d8506" containerName="oc" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.395196 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f47f2d5-f4d5-448d-9355-ebe37959b584" containerName="dnsmasq-dns" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.396765 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb78ed91-75d4-40d9-9359-da1c3878e145" containerName="collect-profiles" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.396851 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="27b0d2a5-5629-42a0-8884-a5534240b356" containerName="nova-manage" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.396865 4781 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="b8ec74af-d604-42ac-83bb-db047e8d8506" containerName="oc" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.396883 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="7524846b-772f-47a1-aaae-e7f29db2c0b5" containerName="nova-metadata-log" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.396895 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="7524846b-772f-47a1-aaae-e7f29db2c0b5" containerName="nova-metadata-metadata" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.398368 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.402534 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.402918 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.419001 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.527113 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e32c3573-4acb-4d70-aa6e-2d647c108931-config-data\") pod \"nova-metadata-0\" (UID: \"e32c3573-4acb-4d70-aa6e-2d647c108931\") " pod="openstack/nova-metadata-0" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.527169 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e32c3573-4acb-4d70-aa6e-2d647c108931-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e32c3573-4acb-4d70-aa6e-2d647c108931\") " pod="openstack/nova-metadata-0" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 
00:30:14.527228 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e32c3573-4acb-4d70-aa6e-2d647c108931-logs\") pod \"nova-metadata-0\" (UID: \"e32c3573-4acb-4d70-aa6e-2d647c108931\") " pod="openstack/nova-metadata-0" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.527287 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e32c3573-4acb-4d70-aa6e-2d647c108931-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e32c3573-4acb-4d70-aa6e-2d647c108931\") " pod="openstack/nova-metadata-0" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.527317 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmd4g\" (UniqueName: \"kubernetes.io/projected/e32c3573-4acb-4d70-aa6e-2d647c108931-kube-api-access-gmd4g\") pod \"nova-metadata-0\" (UID: \"e32c3573-4acb-4d70-aa6e-2d647c108931\") " pod="openstack/nova-metadata-0" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.629540 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e32c3573-4acb-4d70-aa6e-2d647c108931-config-data\") pod \"nova-metadata-0\" (UID: \"e32c3573-4acb-4d70-aa6e-2d647c108931\") " pod="openstack/nova-metadata-0" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.629617 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e32c3573-4acb-4d70-aa6e-2d647c108931-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e32c3573-4acb-4d70-aa6e-2d647c108931\") " pod="openstack/nova-metadata-0" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.629728 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e32c3573-4acb-4d70-aa6e-2d647c108931-logs\") pod \"nova-metadata-0\" (UID: \"e32c3573-4acb-4d70-aa6e-2d647c108931\") " pod="openstack/nova-metadata-0" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.629817 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e32c3573-4acb-4d70-aa6e-2d647c108931-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e32c3573-4acb-4d70-aa6e-2d647c108931\") " pod="openstack/nova-metadata-0" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.629855 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmd4g\" (UniqueName: \"kubernetes.io/projected/e32c3573-4acb-4d70-aa6e-2d647c108931-kube-api-access-gmd4g\") pod \"nova-metadata-0\" (UID: \"e32c3573-4acb-4d70-aa6e-2d647c108931\") " pod="openstack/nova-metadata-0" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.630298 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e32c3573-4acb-4d70-aa6e-2d647c108931-logs\") pod \"nova-metadata-0\" (UID: \"e32c3573-4acb-4d70-aa6e-2d647c108931\") " pod="openstack/nova-metadata-0" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.637344 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e32c3573-4acb-4d70-aa6e-2d647c108931-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e32c3573-4acb-4d70-aa6e-2d647c108931\") " pod="openstack/nova-metadata-0" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.637499 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e32c3573-4acb-4d70-aa6e-2d647c108931-config-data\") pod \"nova-metadata-0\" (UID: \"e32c3573-4acb-4d70-aa6e-2d647c108931\") " pod="openstack/nova-metadata-0" Feb 27 00:30:14 
crc kubenswrapper[4781]: I0227 00:30:14.637539 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e32c3573-4acb-4d70-aa6e-2d647c108931-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e32c3573-4acb-4d70-aa6e-2d647c108931\") " pod="openstack/nova-metadata-0" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.656963 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmd4g\" (UniqueName: \"kubernetes.io/projected/e32c3573-4acb-4d70-aa6e-2d647c108931-kube-api-access-gmd4g\") pod \"nova-metadata-0\" (UID: \"e32c3573-4acb-4d70-aa6e-2d647c108931\") " pod="openstack/nova-metadata-0" Feb 27 00:30:14 crc kubenswrapper[4781]: I0227 00:30:14.716264 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.048637 4781 generic.go:334] "Generic (PLEG): container finished" podID="f5e01f6b-d306-41ac-9988-156063c5af7d" containerID="8a434297e1d497ddfa074c1233744a9c79e7a3482bb8e37e36657a3849467eab" exitCode=0 Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.048999 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f5e01f6b-d306-41ac-9988-156063c5af7d","Type":"ContainerDied","Data":"8a434297e1d497ddfa074c1233744a9c79e7a3482bb8e37e36657a3849467eab"} Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.052439 4781 generic.go:334] "Generic (PLEG): container finished" podID="0ecbcf96-0260-4e87-afe5-9acc6098ec59" containerID="4fb1191f2a33c534b5ff081b75c12582b7d083d0914703fd100fe77a443ffaca" exitCode=0 Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.052473 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0ecbcf96-0260-4e87-afe5-9acc6098ec59","Type":"ContainerDied","Data":"4fb1191f2a33c534b5ff081b75c12582b7d083d0914703fd100fe77a443ffaca"} Feb 27 
00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.160710 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.241613 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-public-tls-certs\") pod \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.241774 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ecbcf96-0260-4e87-afe5-9acc6098ec59-logs\") pod \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.241828 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-config-data\") pod \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.241866 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-combined-ca-bundle\") pod \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.241944 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-internal-tls-certs\") pod \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.242015 4781 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmfqp\" (UniqueName: \"kubernetes.io/projected/0ecbcf96-0260-4e87-afe5-9acc6098ec59-kube-api-access-vmfqp\") pod \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\" (UID: \"0ecbcf96-0260-4e87-afe5-9acc6098ec59\") " Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.243409 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ecbcf96-0260-4e87-afe5-9acc6098ec59-logs" (OuterVolumeSpecName: "logs") pod "0ecbcf96-0260-4e87-afe5-9acc6098ec59" (UID: "0ecbcf96-0260-4e87-afe5-9acc6098ec59"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.246337 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ecbcf96-0260-4e87-afe5-9acc6098ec59-kube-api-access-vmfqp" (OuterVolumeSpecName: "kube-api-access-vmfqp") pod "0ecbcf96-0260-4e87-afe5-9acc6098ec59" (UID: "0ecbcf96-0260-4e87-afe5-9acc6098ec59"). InnerVolumeSpecName "kube-api-access-vmfqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.284415 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-config-data" (OuterVolumeSpecName: "config-data") pod "0ecbcf96-0260-4e87-afe5-9acc6098ec59" (UID: "0ecbcf96-0260-4e87-afe5-9acc6098ec59"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.288519 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ecbcf96-0260-4e87-afe5-9acc6098ec59" (UID: "0ecbcf96-0260-4e87-afe5-9acc6098ec59"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.313763 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0ecbcf96-0260-4e87-afe5-9acc6098ec59" (UID: "0ecbcf96-0260-4e87-afe5-9acc6098ec59"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.328259 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0ecbcf96-0260-4e87-afe5-9acc6098ec59" (UID: "0ecbcf96-0260-4e87-afe5-9acc6098ec59"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.330477 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7524846b-772f-47a1-aaae-e7f29db2c0b5" path="/var/lib/kubelet/pods/7524846b-772f-47a1-aaae-e7f29db2c0b5/volumes" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.353557 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ecbcf96-0260-4e87-afe5-9acc6098ec59-logs\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.353653 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.353666 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:15 crc 
kubenswrapper[4781]: I0227 00:30:15.353680 4781 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.353698 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmfqp\" (UniqueName: \"kubernetes.io/projected/0ecbcf96-0260-4e87-afe5-9acc6098ec59-kube-api-access-vmfqp\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.353707 4781 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ecbcf96-0260-4e87-afe5-9acc6098ec59-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.397649 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.411175 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.455304 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5e01f6b-d306-41ac-9988-156063c5af7d-combined-ca-bundle\") pod \"f5e01f6b-d306-41ac-9988-156063c5af7d\" (UID: \"f5e01f6b-d306-41ac-9988-156063c5af7d\") " Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.455405 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5e01f6b-d306-41ac-9988-156063c5af7d-config-data\") pod \"f5e01f6b-d306-41ac-9988-156063c5af7d\" (UID: \"f5e01f6b-d306-41ac-9988-156063c5af7d\") " Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.455695 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-mrw78\" (UniqueName: \"kubernetes.io/projected/f5e01f6b-d306-41ac-9988-156063c5af7d-kube-api-access-mrw78\") pod \"f5e01f6b-d306-41ac-9988-156063c5af7d\" (UID: \"f5e01f6b-d306-41ac-9988-156063c5af7d\") " Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.461492 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5e01f6b-d306-41ac-9988-156063c5af7d-kube-api-access-mrw78" (OuterVolumeSpecName: "kube-api-access-mrw78") pod "f5e01f6b-d306-41ac-9988-156063c5af7d" (UID: "f5e01f6b-d306-41ac-9988-156063c5af7d"). InnerVolumeSpecName "kube-api-access-mrw78". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.499477 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5e01f6b-d306-41ac-9988-156063c5af7d-config-data" (OuterVolumeSpecName: "config-data") pod "f5e01f6b-d306-41ac-9988-156063c5af7d" (UID: "f5e01f6b-d306-41ac-9988-156063c5af7d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.513858 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5e01f6b-d306-41ac-9988-156063c5af7d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f5e01f6b-d306-41ac-9988-156063c5af7d" (UID: "f5e01f6b-d306-41ac-9988-156063c5af7d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.560115 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrw78\" (UniqueName: \"kubernetes.io/projected/f5e01f6b-d306-41ac-9988-156063c5af7d-kube-api-access-mrw78\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.560168 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5e01f6b-d306-41ac-9988-156063c5af7d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:15 crc kubenswrapper[4781]: I0227 00:30:15.560179 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5e01f6b-d306-41ac-9988-156063c5af7d-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.065036 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e32c3573-4acb-4d70-aa6e-2d647c108931","Type":"ContainerStarted","Data":"edef59ffff3d0873180602792810e3097a50cc4082ff788c05564c40ecb2297b"} Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.065436 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e32c3573-4acb-4d70-aa6e-2d647c108931","Type":"ContainerStarted","Data":"bc44e17aecd79eaa2526351b746c2930a0882074242ccf381e0b8d197a5ac152"} Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.065450 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e32c3573-4acb-4d70-aa6e-2d647c108931","Type":"ContainerStarted","Data":"016c4f8330bcc7bd961914e21614eb5a9ff7ba7a7613602e0f9713edabc22f78"} Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.067530 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"f5e01f6b-d306-41ac-9988-156063c5af7d","Type":"ContainerDied","Data":"9fd55676c924089cae058f2bc5bdb6090578b3476a36c5a733237f94e45c9618"} Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.067565 4781 scope.go:117] "RemoveContainer" containerID="8a434297e1d497ddfa074c1233744a9c79e7a3482bb8e37e36657a3849467eab" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.067621 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.070169 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0ecbcf96-0260-4e87-afe5-9acc6098ec59","Type":"ContainerDied","Data":"b6418bc15cd13e8d0bd6da20db363f8f00d774a0d59f04ab4104216089ac4cbd"} Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.070247 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.109159 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.109135545 podStartE2EDuration="2.109135545s" podCreationTimestamp="2026-02-27 00:30:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:30:16.086869793 +0000 UTC m=+1485.344409367" watchObservedRunningTime="2026-02-27 00:30:16.109135545 +0000 UTC m=+1485.366675099" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.109235 4781 scope.go:117] "RemoveContainer" containerID="4fb1191f2a33c534b5ff081b75c12582b7d083d0914703fd100fe77a443ffaca" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.143707 4781 scope.go:117] "RemoveContainer" containerID="a1f456672f54264ca1c600a8f933a178f9e17b564318587aad61468738ce8718" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.144674 4781 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/nova-api-0"] Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.175751 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.200221 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.216925 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.225984 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 27 00:30:16 crc kubenswrapper[4781]: E0227 00:30:16.226436 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ecbcf96-0260-4e87-afe5-9acc6098ec59" containerName="nova-api-api" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.226454 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ecbcf96-0260-4e87-afe5-9acc6098ec59" containerName="nova-api-api" Feb 27 00:30:16 crc kubenswrapper[4781]: E0227 00:30:16.226465 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5e01f6b-d306-41ac-9988-156063c5af7d" containerName="nova-scheduler-scheduler" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.226471 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5e01f6b-d306-41ac-9988-156063c5af7d" containerName="nova-scheduler-scheduler" Feb 27 00:30:16 crc kubenswrapper[4781]: E0227 00:30:16.226484 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ecbcf96-0260-4e87-afe5-9acc6098ec59" containerName="nova-api-log" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.226492 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ecbcf96-0260-4e87-afe5-9acc6098ec59" containerName="nova-api-log" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.226734 4781 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0ecbcf96-0260-4e87-afe5-9acc6098ec59" containerName="nova-api-api" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.226759 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ecbcf96-0260-4e87-afe5-9acc6098ec59" containerName="nova-api-log" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.226774 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5e01f6b-d306-41ac-9988-156063c5af7d" containerName="nova-scheduler-scheduler" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.227826 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.230233 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.230364 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.230747 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.273763 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.280754 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.290225 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.292312 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e258c11-5caa-4d6b-ab77-841ddf83ac81-logs\") pod \"nova-api-0\" (UID: \"4e258c11-5caa-4d6b-ab77-841ddf83ac81\") " pod="openstack/nova-api-0" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.292472 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e258c11-5caa-4d6b-ab77-841ddf83ac81-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4e258c11-5caa-4d6b-ab77-841ddf83ac81\") " pod="openstack/nova-api-0" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.292753 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv52c\" (UniqueName: \"kubernetes.io/projected/4e258c11-5caa-4d6b-ab77-841ddf83ac81-kube-api-access-zv52c\") pod \"nova-api-0\" (UID: \"4e258c11-5caa-4d6b-ab77-841ddf83ac81\") " pod="openstack/nova-api-0" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.292869 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e258c11-5caa-4d6b-ab77-841ddf83ac81-public-tls-certs\") pod \"nova-api-0\" (UID: \"4e258c11-5caa-4d6b-ab77-841ddf83ac81\") " pod="openstack/nova-api-0" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.293403 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e258c11-5caa-4d6b-ab77-841ddf83ac81-config-data\") pod 
\"nova-api-0\" (UID: \"4e258c11-5caa-4d6b-ab77-841ddf83ac81\") " pod="openstack/nova-api-0" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.293766 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e258c11-5caa-4d6b-ab77-841ddf83ac81-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4e258c11-5caa-4d6b-ab77-841ddf83ac81\") " pod="openstack/nova-api-0" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.296813 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.305302 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.395444 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e258c11-5caa-4d6b-ab77-841ddf83ac81-public-tls-certs\") pod \"nova-api-0\" (UID: \"4e258c11-5caa-4d6b-ab77-841ddf83ac81\") " pod="openstack/nova-api-0" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.395526 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqc6w\" (UniqueName: \"kubernetes.io/projected/1d7f8c00-d318-4f7d-b67e-6743c3a82dae-kube-api-access-hqc6w\") pod \"nova-scheduler-0\" (UID: \"1d7f8c00-d318-4f7d-b67e-6743c3a82dae\") " pod="openstack/nova-scheduler-0" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.395708 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d7f8c00-d318-4f7d-b67e-6743c3a82dae-config-data\") pod \"nova-scheduler-0\" (UID: \"1d7f8c00-d318-4f7d-b67e-6743c3a82dae\") " pod="openstack/nova-scheduler-0" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.395800 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e258c11-5caa-4d6b-ab77-841ddf83ac81-config-data\") pod \"nova-api-0\" (UID: \"4e258c11-5caa-4d6b-ab77-841ddf83ac81\") " pod="openstack/nova-api-0" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.395956 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d7f8c00-d318-4f7d-b67e-6743c3a82dae-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1d7f8c00-d318-4f7d-b67e-6743c3a82dae\") " pod="openstack/nova-scheduler-0" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.396000 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e258c11-5caa-4d6b-ab77-841ddf83ac81-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4e258c11-5caa-4d6b-ab77-841ddf83ac81\") " pod="openstack/nova-api-0" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.396058 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e258c11-5caa-4d6b-ab77-841ddf83ac81-logs\") pod \"nova-api-0\" (UID: \"4e258c11-5caa-4d6b-ab77-841ddf83ac81\") " pod="openstack/nova-api-0" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.396113 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e258c11-5caa-4d6b-ab77-841ddf83ac81-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4e258c11-5caa-4d6b-ab77-841ddf83ac81\") " pod="openstack/nova-api-0" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.396148 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv52c\" (UniqueName: \"kubernetes.io/projected/4e258c11-5caa-4d6b-ab77-841ddf83ac81-kube-api-access-zv52c\") pod \"nova-api-0\" (UID: 
\"4e258c11-5caa-4d6b-ab77-841ddf83ac81\") " pod="openstack/nova-api-0" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.397668 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e258c11-5caa-4d6b-ab77-841ddf83ac81-logs\") pod \"nova-api-0\" (UID: \"4e258c11-5caa-4d6b-ab77-841ddf83ac81\") " pod="openstack/nova-api-0" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.400529 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e258c11-5caa-4d6b-ab77-841ddf83ac81-public-tls-certs\") pod \"nova-api-0\" (UID: \"4e258c11-5caa-4d6b-ab77-841ddf83ac81\") " pod="openstack/nova-api-0" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.401104 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e258c11-5caa-4d6b-ab77-841ddf83ac81-config-data\") pod \"nova-api-0\" (UID: \"4e258c11-5caa-4d6b-ab77-841ddf83ac81\") " pod="openstack/nova-api-0" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.401139 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e258c11-5caa-4d6b-ab77-841ddf83ac81-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4e258c11-5caa-4d6b-ab77-841ddf83ac81\") " pod="openstack/nova-api-0" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.401226 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e258c11-5caa-4d6b-ab77-841ddf83ac81-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4e258c11-5caa-4d6b-ab77-841ddf83ac81\") " pod="openstack/nova-api-0" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.422101 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv52c\" (UniqueName: 
\"kubernetes.io/projected/4e258c11-5caa-4d6b-ab77-841ddf83ac81-kube-api-access-zv52c\") pod \"nova-api-0\" (UID: \"4e258c11-5caa-4d6b-ab77-841ddf83ac81\") " pod="openstack/nova-api-0" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.497994 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d7f8c00-d318-4f7d-b67e-6743c3a82dae-config-data\") pod \"nova-scheduler-0\" (UID: \"1d7f8c00-d318-4f7d-b67e-6743c3a82dae\") " pod="openstack/nova-scheduler-0" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.498105 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d7f8c00-d318-4f7d-b67e-6743c3a82dae-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1d7f8c00-d318-4f7d-b67e-6743c3a82dae\") " pod="openstack/nova-scheduler-0" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.498174 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqc6w\" (UniqueName: \"kubernetes.io/projected/1d7f8c00-d318-4f7d-b67e-6743c3a82dae-kube-api-access-hqc6w\") pod \"nova-scheduler-0\" (UID: \"1d7f8c00-d318-4f7d-b67e-6743c3a82dae\") " pod="openstack/nova-scheduler-0" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.501854 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d7f8c00-d318-4f7d-b67e-6743c3a82dae-config-data\") pod \"nova-scheduler-0\" (UID: \"1d7f8c00-d318-4f7d-b67e-6743c3a82dae\") " pod="openstack/nova-scheduler-0" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.502614 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d7f8c00-d318-4f7d-b67e-6743c3a82dae-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1d7f8c00-d318-4f7d-b67e-6743c3a82dae\") " pod="openstack/nova-scheduler-0" 
Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.514751 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqc6w\" (UniqueName: \"kubernetes.io/projected/1d7f8c00-d318-4f7d-b67e-6743c3a82dae-kube-api-access-hqc6w\") pod \"nova-scheduler-0\" (UID: \"1d7f8c00-d318-4f7d-b67e-6743c3a82dae\") " pod="openstack/nova-scheduler-0" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.557383 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 27 00:30:16 crc kubenswrapper[4781]: I0227 00:30:16.623887 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 27 00:30:17 crc kubenswrapper[4781]: I0227 00:30:17.056471 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 27 00:30:17 crc kubenswrapper[4781]: W0227 00:30:17.059077 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e258c11_5caa_4d6b_ab77_841ddf83ac81.slice/crio-d68ab3f956ceb1f76180523b1da9ac1e6a71f2faf4fcadcc14a72c8376f2f86d WatchSource:0}: Error finding container d68ab3f956ceb1f76180523b1da9ac1e6a71f2faf4fcadcc14a72c8376f2f86d: Status 404 returned error can't find the container with id d68ab3f956ceb1f76180523b1da9ac1e6a71f2faf4fcadcc14a72c8376f2f86d Feb 27 00:30:17 crc kubenswrapper[4781]: I0227 00:30:17.081032 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4e258c11-5caa-4d6b-ab77-841ddf83ac81","Type":"ContainerStarted","Data":"d68ab3f956ceb1f76180523b1da9ac1e6a71f2faf4fcadcc14a72c8376f2f86d"} Feb 27 00:30:17 crc kubenswrapper[4781]: W0227 00:30:17.176470 4781 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d7f8c00_d318_4f7d_b67e_6743c3a82dae.slice/crio-4aaddd97e770ae5123af55ed21c62a17a6d5ebf9815ca468e08d021529fa6778 WatchSource:0}: Error finding container 4aaddd97e770ae5123af55ed21c62a17a6d5ebf9815ca468e08d021529fa6778: Status 404 returned error can't find the container with id 4aaddd97e770ae5123af55ed21c62a17a6d5ebf9815ca468e08d021529fa6778 Feb 27 00:30:17 crc kubenswrapper[4781]: I0227 00:30:17.180907 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 00:30:17 crc kubenswrapper[4781]: I0227 00:30:17.325061 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ecbcf96-0260-4e87-afe5-9acc6098ec59" path="/var/lib/kubelet/pods/0ecbcf96-0260-4e87-afe5-9acc6098ec59/volumes" Feb 27 00:30:17 crc kubenswrapper[4781]: I0227 00:30:17.328883 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5e01f6b-d306-41ac-9988-156063c5af7d" path="/var/lib/kubelet/pods/f5e01f6b-d306-41ac-9988-156063c5af7d/volumes" Feb 27 00:30:18 crc kubenswrapper[4781]: I0227 00:30:18.098138 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4e258c11-5caa-4d6b-ab77-841ddf83ac81","Type":"ContainerStarted","Data":"700884efc89dd85a512e20845640f310b33900ef785ea60c53d7aa26f85af38d"} Feb 27 00:30:18 crc kubenswrapper[4781]: I0227 00:30:18.098519 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4e258c11-5caa-4d6b-ab77-841ddf83ac81","Type":"ContainerStarted","Data":"06c91865ba81c14d409c91073f4ed9bfdd0ca7faac32ba2130c7a442b7dc699c"} Feb 27 00:30:18 crc kubenswrapper[4781]: I0227 00:30:18.100471 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1d7f8c00-d318-4f7d-b67e-6743c3a82dae","Type":"ContainerStarted","Data":"eb3d1f00e242dd2b542c92285321576cfdd7b7c9a7730699acee14eb1633cb42"} Feb 27 00:30:18 crc 
kubenswrapper[4781]: I0227 00:30:18.100530 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1d7f8c00-d318-4f7d-b67e-6743c3a82dae","Type":"ContainerStarted","Data":"4aaddd97e770ae5123af55ed21c62a17a6d5ebf9815ca468e08d021529fa6778"} Feb 27 00:30:18 crc kubenswrapper[4781]: I0227 00:30:18.126600 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.126558997 podStartE2EDuration="2.126558997s" podCreationTimestamp="2026-02-27 00:30:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:30:18.120141394 +0000 UTC m=+1487.377680948" watchObservedRunningTime="2026-02-27 00:30:18.126558997 +0000 UTC m=+1487.384098551" Feb 27 00:30:18 crc kubenswrapper[4781]: I0227 00:30:18.149148 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.149130117 podStartE2EDuration="2.149130117s" podCreationTimestamp="2026-02-27 00:30:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:30:18.144297037 +0000 UTC m=+1487.401836611" watchObservedRunningTime="2026-02-27 00:30:18.149130117 +0000 UTC m=+1487.406669671" Feb 27 00:30:19 crc kubenswrapper[4781]: I0227 00:30:19.716927 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 27 00:30:19 crc kubenswrapper[4781]: I0227 00:30:19.717255 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 27 00:30:20 crc kubenswrapper[4781]: I0227 00:30:20.065037 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-49thr" podUID="cc2a000c-f169-4622-8c82-cd4c2baa730a" containerName="registry-server" probeResult="failure" output=< 
Feb 27 00:30:20 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s Feb 27 00:30:20 crc kubenswrapper[4781]: > Feb 27 00:30:21 crc kubenswrapper[4781]: I0227 00:30:21.624149 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 27 00:30:24 crc kubenswrapper[4781]: I0227 00:30:24.716796 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 27 00:30:24 crc kubenswrapper[4781]: I0227 00:30:24.717255 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 27 00:30:25 crc kubenswrapper[4781]: I0227 00:30:25.732856 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e32c3573-4acb-4d70-aa6e-2d647c108931" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.238:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 27 00:30:25 crc kubenswrapper[4781]: I0227 00:30:25.732892 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e32c3573-4acb-4d70-aa6e-2d647c108931" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.238:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 27 00:30:26 crc kubenswrapper[4781]: I0227 00:30:26.543082 4781 scope.go:117] "RemoveContainer" containerID="e7c34540c9407121a9ee96d4e0537e4a13bd65448411272b9cedd072273699e8" Feb 27 00:30:26 crc kubenswrapper[4781]: I0227 00:30:26.558342 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 27 00:30:26 crc kubenswrapper[4781]: I0227 00:30:26.558401 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 27 00:30:26 crc kubenswrapper[4781]: I0227 00:30:26.625161 4781 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 27 00:30:26 crc kubenswrapper[4781]: I0227 00:30:26.674282 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 27 00:30:27 crc kubenswrapper[4781]: I0227 00:30:27.284174 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 27 00:30:27 crc kubenswrapper[4781]: I0227 00:30:27.574771 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4e258c11-5caa-4d6b-ab77-841ddf83ac81" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.239:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 27 00:30:27 crc kubenswrapper[4781]: I0227 00:30:27.574783 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4e258c11-5caa-4d6b-ab77-841ddf83ac81" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.239:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 27 00:30:28 crc kubenswrapper[4781]: I0227 00:30:28.115080 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 27 00:30:30 crc kubenswrapper[4781]: I0227 00:30:30.079337 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-49thr" podUID="cc2a000c-f169-4622-8c82-cd4c2baa730a" containerName="registry-server" probeResult="failure" output=< Feb 27 00:30:30 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s Feb 27 00:30:30 crc kubenswrapper[4781]: > Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.286192 4781 generic.go:334] "Generic (PLEG): container finished" 
podID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerID="12384cd6d81bca6e23791de7f51204e92fef8abc9d7b3fd6d25b23d6563fe9c1" exitCode=137 Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.286739 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93fc1751-3b02-4d9a-8278-caa0f09f8e9e","Type":"ContainerDied","Data":"12384cd6d81bca6e23791de7f51204e92fef8abc9d7b3fd6d25b23d6563fe9c1"} Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.371388 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.447120 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-run-httpd\") pod \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.447187 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-sg-core-conf-yaml\") pod \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.447264 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-log-httpd\") pod \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.447287 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-config-data\") pod \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " 
Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.447322 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-scripts\") pod \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.447372 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5spl\" (UniqueName: \"kubernetes.io/projected/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-kube-api-access-b5spl\") pod \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.447502 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-combined-ca-bundle\") pod \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\" (UID: \"93fc1751-3b02-4d9a-8278-caa0f09f8e9e\") " Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.448445 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "93fc1751-3b02-4d9a-8278-caa0f09f8e9e" (UID: "93fc1751-3b02-4d9a-8278-caa0f09f8e9e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.449201 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "93fc1751-3b02-4d9a-8278-caa0f09f8e9e" (UID: "93fc1751-3b02-4d9a-8278-caa0f09f8e9e"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.453827 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-kube-api-access-b5spl" (OuterVolumeSpecName: "kube-api-access-b5spl") pod "93fc1751-3b02-4d9a-8278-caa0f09f8e9e" (UID: "93fc1751-3b02-4d9a-8278-caa0f09f8e9e"). InnerVolumeSpecName "kube-api-access-b5spl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.455846 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-scripts" (OuterVolumeSpecName: "scripts") pod "93fc1751-3b02-4d9a-8278-caa0f09f8e9e" (UID: "93fc1751-3b02-4d9a-8278-caa0f09f8e9e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.489554 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "93fc1751-3b02-4d9a-8278-caa0f09f8e9e" (UID: "93fc1751-3b02-4d9a-8278-caa0f09f8e9e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.530247 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93fc1751-3b02-4d9a-8278-caa0f09f8e9e" (UID: "93fc1751-3b02-4d9a-8278-caa0f09f8e9e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.550285 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.550324 4781 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.550336 4781 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.550347 4781 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.550360 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.550371 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5spl\" (UniqueName: \"kubernetes.io/projected/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-kube-api-access-b5spl\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.581581 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-config-data" (OuterVolumeSpecName: "config-data") pod "93fc1751-3b02-4d9a-8278-caa0f09f8e9e" (UID: "93fc1751-3b02-4d9a-8278-caa0f09f8e9e"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.652304 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93fc1751-3b02-4d9a-8278-caa0f09f8e9e-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.723926 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.725683 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 27 00:30:34 crc kubenswrapper[4781]: I0227 00:30:34.732487 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.297581 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93fc1751-3b02-4d9a-8278-caa0f09f8e9e","Type":"ContainerDied","Data":"2c4686839ed88b8a45c07bbc45e5d7e8f95577bd88f8f5ed02b133c4326a106e"} Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.297650 4781 scope.go:117] "RemoveContainer" containerID="12384cd6d81bca6e23791de7f51204e92fef8abc9d7b3fd6d25b23d6563fe9c1" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.298412 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.305989 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.320238 4781 scope.go:117] "RemoveContainer" containerID="fe5f026af9d91c7231233bd71a19d73aa9a872a6aadf224c6825ac237ba54fe7" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.344016 4781 scope.go:117] "RemoveContainer" containerID="7b717e2de43421aa8594d897ad2625a4ab8b03819b1fddaf68c17d412b495f36" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.377674 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.385692 4781 scope.go:117] "RemoveContainer" containerID="6aee62b4a0e99b93b70f6483a48731232b6e0bac4ac727deafa8d7b77d1fa9e5" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.388774 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.463918 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:30:35 crc kubenswrapper[4781]: E0227 00:30:35.464418 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerName="ceilometer-notification-agent" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.464439 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerName="ceilometer-notification-agent" Feb 27 00:30:35 crc kubenswrapper[4781]: E0227 00:30:35.464476 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerName="sg-core" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.464483 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" 
containerName="sg-core" Feb 27 00:30:35 crc kubenswrapper[4781]: E0227 00:30:35.464497 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerName="ceilometer-central-agent" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.464537 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerName="ceilometer-central-agent" Feb 27 00:30:35 crc kubenswrapper[4781]: E0227 00:30:35.464549 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerName="proxy-httpd" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.464556 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerName="proxy-httpd" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.464750 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerName="ceilometer-central-agent" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.464766 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerName="sg-core" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.464783 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerName="proxy-httpd" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.464806 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" containerName="ceilometer-notification-agent" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.467962 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.470782 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.471038 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.471366 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.494854 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.574008 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-log-httpd\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.574584 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-scripts\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.574649 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njzlk\" (UniqueName: \"kubernetes.io/projected/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-kube-api-access-njzlk\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.574681 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.574745 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-run-httpd\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.574779 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.574998 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-config-data\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.575052 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.677378 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-combined-ca-bundle\") pod \"ceilometer-0\" 
(UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.677476 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-config-data\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.677498 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.677549 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-log-httpd\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.677577 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-scripts\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.677606 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njzlk\" (UniqueName: \"kubernetes.io/projected/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-kube-api-access-njzlk\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.677639 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.677687 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-run-httpd\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.678123 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-run-httpd\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.679816 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-log-httpd\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.683694 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.684707 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0" Feb 27 00:30:35 crc kubenswrapper[4781]: 
I0227 00:30:35.684825 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-config-data\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.685725 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-scripts\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.703549 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.704006 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njzlk\" (UniqueName: \"kubernetes.io/projected/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-kube-api-access-njzlk\") pod \"ceilometer-0\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " pod="openstack/ceilometer-0" Feb 27 00:30:35 crc kubenswrapper[4781]: I0227 00:30:35.786442 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:30:36 crc kubenswrapper[4781]: I0227 00:30:36.287318 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:30:36 crc kubenswrapper[4781]: I0227 00:30:36.309225 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"277c2e9c-3c87-442a-b5f6-52f1d63c24e9","Type":"ContainerStarted","Data":"94c7511afd7913c3a074803cf7f0cdd498d1a4a5ebb3ef1f330c0237d0afa73c"} Feb 27 00:30:36 crc kubenswrapper[4781]: I0227 00:30:36.566718 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 27 00:30:36 crc kubenswrapper[4781]: I0227 00:30:36.568191 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 27 00:30:36 crc kubenswrapper[4781]: I0227 00:30:36.569688 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 27 00:30:36 crc kubenswrapper[4781]: I0227 00:30:36.578169 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 27 00:30:37 crc kubenswrapper[4781]: I0227 00:30:37.321162 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93fc1751-3b02-4d9a-8278-caa0f09f8e9e" path="/var/lib/kubelet/pods/93fc1751-3b02-4d9a-8278-caa0f09f8e9e/volumes" Feb 27 00:30:37 crc kubenswrapper[4781]: I0227 00:30:37.324221 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"277c2e9c-3c87-442a-b5f6-52f1d63c24e9","Type":"ContainerStarted","Data":"9f4615d95e6e63d1efd4e66baa23b1f80ac3bd3f83b086b7f16e524d977e4e51"} Feb 27 00:30:37 crc kubenswrapper[4781]: I0227 00:30:37.324892 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 27 00:30:37 crc kubenswrapper[4781]: I0227 00:30:37.331225 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-api-0" Feb 27 00:30:38 crc kubenswrapper[4781]: I0227 00:30:38.336347 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"277c2e9c-3c87-442a-b5f6-52f1d63c24e9","Type":"ContainerStarted","Data":"b624b6912b0767e231affc3c40d0f364ae69e607a4f319243ac55c5714e52ed5"} Feb 27 00:30:39 crc kubenswrapper[4781]: I0227 00:30:39.059933 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-49thr" Feb 27 00:30:39 crc kubenswrapper[4781]: I0227 00:30:39.119967 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-49thr" Feb 27 00:30:39 crc kubenswrapper[4781]: I0227 00:30:39.301432 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-49thr"] Feb 27 00:30:39 crc kubenswrapper[4781]: I0227 00:30:39.352677 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"277c2e9c-3c87-442a-b5f6-52f1d63c24e9","Type":"ContainerStarted","Data":"1b0cdadfca07605835d8748d2c2b00aa7df4e4445c1a8c1212419d7d09db934f"} Feb 27 00:30:40 crc kubenswrapper[4781]: I0227 00:30:40.363898 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-49thr" podUID="cc2a000c-f169-4622-8c82-cd4c2baa730a" containerName="registry-server" containerID="cri-o://d9d953c5a94e7131260a7cfb99d3ce8775a2f8d8e160f41df132bea931da9344" gracePeriod=2 Feb 27 00:30:40 crc kubenswrapper[4781]: I0227 00:30:40.885237 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-49thr" Feb 27 00:30:40 crc kubenswrapper[4781]: I0227 00:30:40.994925 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwm7q\" (UniqueName: \"kubernetes.io/projected/cc2a000c-f169-4622-8c82-cd4c2baa730a-kube-api-access-dwm7q\") pod \"cc2a000c-f169-4622-8c82-cd4c2baa730a\" (UID: \"cc2a000c-f169-4622-8c82-cd4c2baa730a\") " Feb 27 00:30:40 crc kubenswrapper[4781]: I0227 00:30:40.995003 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc2a000c-f169-4622-8c82-cd4c2baa730a-utilities\") pod \"cc2a000c-f169-4622-8c82-cd4c2baa730a\" (UID: \"cc2a000c-f169-4622-8c82-cd4c2baa730a\") " Feb 27 00:30:40 crc kubenswrapper[4781]: I0227 00:30:40.995051 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc2a000c-f169-4622-8c82-cd4c2baa730a-catalog-content\") pod \"cc2a000c-f169-4622-8c82-cd4c2baa730a\" (UID: \"cc2a000c-f169-4622-8c82-cd4c2baa730a\") " Feb 27 00:30:40 crc kubenswrapper[4781]: I0227 00:30:40.995727 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc2a000c-f169-4622-8c82-cd4c2baa730a-utilities" (OuterVolumeSpecName: "utilities") pod "cc2a000c-f169-4622-8c82-cd4c2baa730a" (UID: "cc2a000c-f169-4622-8c82-cd4c2baa730a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.000163 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc2a000c-f169-4622-8c82-cd4c2baa730a-kube-api-access-dwm7q" (OuterVolumeSpecName: "kube-api-access-dwm7q") pod "cc2a000c-f169-4622-8c82-cd4c2baa730a" (UID: "cc2a000c-f169-4622-8c82-cd4c2baa730a"). InnerVolumeSpecName "kube-api-access-dwm7q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.097330 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwm7q\" (UniqueName: \"kubernetes.io/projected/cc2a000c-f169-4622-8c82-cd4c2baa730a-kube-api-access-dwm7q\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.097358 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc2a000c-f169-4622-8c82-cd4c2baa730a-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.112012 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc2a000c-f169-4622-8c82-cd4c2baa730a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc2a000c-f169-4622-8c82-cd4c2baa730a" (UID: "cc2a000c-f169-4622-8c82-cd4c2baa730a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.199716 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc2a000c-f169-4622-8c82-cd4c2baa730a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.375980 4781 generic.go:334] "Generic (PLEG): container finished" podID="cc2a000c-f169-4622-8c82-cd4c2baa730a" containerID="d9d953c5a94e7131260a7cfb99d3ce8775a2f8d8e160f41df132bea931da9344" exitCode=0 Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.376015 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-49thr" Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.376021 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49thr" event={"ID":"cc2a000c-f169-4622-8c82-cd4c2baa730a","Type":"ContainerDied","Data":"d9d953c5a94e7131260a7cfb99d3ce8775a2f8d8e160f41df132bea931da9344"} Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.376048 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49thr" event={"ID":"cc2a000c-f169-4622-8c82-cd4c2baa730a","Type":"ContainerDied","Data":"91f3015c87e164a4f8b695fde0fa0251c13f94e03e4150c65b9ff72adaa25616"} Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.376066 4781 scope.go:117] "RemoveContainer" containerID="d9d953c5a94e7131260a7cfb99d3ce8775a2f8d8e160f41df132bea931da9344" Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.403447 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-49thr"] Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.409196 4781 scope.go:117] "RemoveContainer" containerID="4da71cf49cfa3d081578fb48fc2ef823314656ae8cf39b6aab2eeb0dcab082a0" Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.416405 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-49thr"] Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.446080 4781 scope.go:117] "RemoveContainer" containerID="a39372cacd71860e2d25c390539bea5ea2a9770dffde8ab63eabcba10e8b848a" Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.495402 4781 scope.go:117] "RemoveContainer" containerID="d9d953c5a94e7131260a7cfb99d3ce8775a2f8d8e160f41df132bea931da9344" Feb 27 00:30:41 crc kubenswrapper[4781]: E0227 00:30:41.495904 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d9d953c5a94e7131260a7cfb99d3ce8775a2f8d8e160f41df132bea931da9344\": container with ID starting with d9d953c5a94e7131260a7cfb99d3ce8775a2f8d8e160f41df132bea931da9344 not found: ID does not exist" containerID="d9d953c5a94e7131260a7cfb99d3ce8775a2f8d8e160f41df132bea931da9344" Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.495937 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9d953c5a94e7131260a7cfb99d3ce8775a2f8d8e160f41df132bea931da9344"} err="failed to get container status \"d9d953c5a94e7131260a7cfb99d3ce8775a2f8d8e160f41df132bea931da9344\": rpc error: code = NotFound desc = could not find container \"d9d953c5a94e7131260a7cfb99d3ce8775a2f8d8e160f41df132bea931da9344\": container with ID starting with d9d953c5a94e7131260a7cfb99d3ce8775a2f8d8e160f41df132bea931da9344 not found: ID does not exist" Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.495961 4781 scope.go:117] "RemoveContainer" containerID="4da71cf49cfa3d081578fb48fc2ef823314656ae8cf39b6aab2eeb0dcab082a0" Feb 27 00:30:41 crc kubenswrapper[4781]: E0227 00:30:41.496367 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4da71cf49cfa3d081578fb48fc2ef823314656ae8cf39b6aab2eeb0dcab082a0\": container with ID starting with 4da71cf49cfa3d081578fb48fc2ef823314656ae8cf39b6aab2eeb0dcab082a0 not found: ID does not exist" containerID="4da71cf49cfa3d081578fb48fc2ef823314656ae8cf39b6aab2eeb0dcab082a0" Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.496404 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4da71cf49cfa3d081578fb48fc2ef823314656ae8cf39b6aab2eeb0dcab082a0"} err="failed to get container status \"4da71cf49cfa3d081578fb48fc2ef823314656ae8cf39b6aab2eeb0dcab082a0\": rpc error: code = NotFound desc = could not find container \"4da71cf49cfa3d081578fb48fc2ef823314656ae8cf39b6aab2eeb0dcab082a0\": container with ID 
starting with 4da71cf49cfa3d081578fb48fc2ef823314656ae8cf39b6aab2eeb0dcab082a0 not found: ID does not exist" Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.496433 4781 scope.go:117] "RemoveContainer" containerID="a39372cacd71860e2d25c390539bea5ea2a9770dffde8ab63eabcba10e8b848a" Feb 27 00:30:41 crc kubenswrapper[4781]: E0227 00:30:41.496847 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a39372cacd71860e2d25c390539bea5ea2a9770dffde8ab63eabcba10e8b848a\": container with ID starting with a39372cacd71860e2d25c390539bea5ea2a9770dffde8ab63eabcba10e8b848a not found: ID does not exist" containerID="a39372cacd71860e2d25c390539bea5ea2a9770dffde8ab63eabcba10e8b848a" Feb 27 00:30:41 crc kubenswrapper[4781]: I0227 00:30:41.496869 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a39372cacd71860e2d25c390539bea5ea2a9770dffde8ab63eabcba10e8b848a"} err="failed to get container status \"a39372cacd71860e2d25c390539bea5ea2a9770dffde8ab63eabcba10e8b848a\": rpc error: code = NotFound desc = could not find container \"a39372cacd71860e2d25c390539bea5ea2a9770dffde8ab63eabcba10e8b848a\": container with ID starting with a39372cacd71860e2d25c390539bea5ea2a9770dffde8ab63eabcba10e8b848a not found: ID does not exist" Feb 27 00:30:42 crc kubenswrapper[4781]: I0227 00:30:42.895111 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:30:42 crc kubenswrapper[4781]: I0227 00:30:42.895182 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:30:43 crc kubenswrapper[4781]: I0227 00:30:43.332524 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc2a000c-f169-4622-8c82-cd4c2baa730a" path="/var/lib/kubelet/pods/cc2a000c-f169-4622-8c82-cd4c2baa730a/volumes" Feb 27 00:30:45 crc kubenswrapper[4781]: I0227 00:30:45.420247 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"277c2e9c-3c87-442a-b5f6-52f1d63c24e9","Type":"ContainerStarted","Data":"11c44c35194959cc78dca0db7f484fb501625e2e603b3f58466927584ff998a4"} Feb 27 00:30:45 crc kubenswrapper[4781]: I0227 00:30:45.420815 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 27 00:30:45 crc kubenswrapper[4781]: I0227 00:30:45.444015 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.89654247 podStartE2EDuration="10.4439945s" podCreationTimestamp="2026-02-27 00:30:35 +0000 UTC" firstStartedPulling="2026-02-27 00:30:36.283239641 +0000 UTC m=+1505.540779195" lastFinishedPulling="2026-02-27 00:30:44.830691671 +0000 UTC m=+1514.088231225" observedRunningTime="2026-02-27 00:30:45.44396752 +0000 UTC m=+1514.701507084" watchObservedRunningTime="2026-02-27 00:30:45.4439945 +0000 UTC m=+1514.701534064" Feb 27 00:31:05 crc kubenswrapper[4781]: I0227 00:31:05.815056 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 27 00:31:12 crc kubenswrapper[4781]: I0227 00:31:12.895307 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:31:12 crc kubenswrapper[4781]: I0227 00:31:12.895983 
4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:31:12 crc kubenswrapper[4781]: I0227 00:31:12.896052 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:31:12 crc kubenswrapper[4781]: I0227 00:31:12.897194 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"18f81d6f38ae3802e83160171263bed0ca095345d87ab2807429711c0c761818"} pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 00:31:12 crc kubenswrapper[4781]: I0227 00:31:12.897310 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" containerID="cri-o://18f81d6f38ae3802e83160171263bed0ca095345d87ab2807429711c0c761818" gracePeriod=600 Feb 27 00:31:13 crc kubenswrapper[4781]: I0227 00:31:13.745587 4781 generic.go:334] "Generic (PLEG): container finished" podID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerID="18f81d6f38ae3802e83160171263bed0ca095345d87ab2807429711c0c761818" exitCode=0 Feb 27 00:31:13 crc kubenswrapper[4781]: I0227 00:31:13.745741 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerDied","Data":"18f81d6f38ae3802e83160171263bed0ca095345d87ab2807429711c0c761818"} Feb 27 00:31:13 crc kubenswrapper[4781]: 
I0227 00:31:13.745956 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f"} Feb 27 00:31:13 crc kubenswrapper[4781]: I0227 00:31:13.745980 4781 scope.go:117] "RemoveContainer" containerID="40924ce0e5e04646329cd01d3e3dc65fdaf6b21bdd01704d3fa5ed81c86443f6" Feb 27 00:31:16 crc kubenswrapper[4781]: I0227 00:31:16.795766 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-sync-l9w6z"] Feb 27 00:31:16 crc kubenswrapper[4781]: I0227 00:31:16.806889 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-sync-l9w6z"] Feb 27 00:31:16 crc kubenswrapper[4781]: I0227 00:31:16.892379 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-sync-6wz7g"] Feb 27 00:31:16 crc kubenswrapper[4781]: E0227 00:31:16.892955 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc2a000c-f169-4622-8c82-cd4c2baa730a" containerName="extract-utilities" Feb 27 00:31:16 crc kubenswrapper[4781]: I0227 00:31:16.892982 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc2a000c-f169-4622-8c82-cd4c2baa730a" containerName="extract-utilities" Feb 27 00:31:16 crc kubenswrapper[4781]: E0227 00:31:16.893012 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc2a000c-f169-4622-8c82-cd4c2baa730a" containerName="registry-server" Feb 27 00:31:16 crc kubenswrapper[4781]: I0227 00:31:16.893021 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc2a000c-f169-4622-8c82-cd4c2baa730a" containerName="registry-server" Feb 27 00:31:16 crc kubenswrapper[4781]: E0227 00:31:16.893038 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc2a000c-f169-4622-8c82-cd4c2baa730a" containerName="extract-content" Feb 27 00:31:16 crc kubenswrapper[4781]: 
I0227 00:31:16.893047 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc2a000c-f169-4622-8c82-cd4c2baa730a" containerName="extract-content" Feb 27 00:31:16 crc kubenswrapper[4781]: I0227 00:31:16.893321 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc2a000c-f169-4622-8c82-cd4c2baa730a" containerName="registry-server" Feb 27 00:31:16 crc kubenswrapper[4781]: I0227 00:31:16.894271 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-6wz7g" Feb 27 00:31:16 crc kubenswrapper[4781]: I0227 00:31:16.896585 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 27 00:31:16 crc kubenswrapper[4781]: I0227 00:31:16.921562 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-6wz7g"] Feb 27 00:31:16 crc kubenswrapper[4781]: I0227 00:31:16.959247 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/b669382e-dffc-421d-80a3-82b928f54044-certs\") pod \"cloudkitty-db-sync-6wz7g\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " pod="openstack/cloudkitty-db-sync-6wz7g" Feb 27 00:31:16 crc kubenswrapper[4781]: I0227 00:31:16.959310 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b669382e-dffc-421d-80a3-82b928f54044-scripts\") pod \"cloudkitty-db-sync-6wz7g\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " pod="openstack/cloudkitty-db-sync-6wz7g" Feb 27 00:31:16 crc kubenswrapper[4781]: I0227 00:31:16.959357 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b669382e-dffc-421d-80a3-82b928f54044-config-data\") pod \"cloudkitty-db-sync-6wz7g\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " 
pod="openstack/cloudkitty-db-sync-6wz7g" Feb 27 00:31:16 crc kubenswrapper[4781]: I0227 00:31:16.959386 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2tjz\" (UniqueName: \"kubernetes.io/projected/b669382e-dffc-421d-80a3-82b928f54044-kube-api-access-w2tjz\") pod \"cloudkitty-db-sync-6wz7g\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " pod="openstack/cloudkitty-db-sync-6wz7g" Feb 27 00:31:16 crc kubenswrapper[4781]: I0227 00:31:16.959411 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b669382e-dffc-421d-80a3-82b928f54044-combined-ca-bundle\") pod \"cloudkitty-db-sync-6wz7g\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " pod="openstack/cloudkitty-db-sync-6wz7g" Feb 27 00:31:17 crc kubenswrapper[4781]: I0227 00:31:17.061150 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/b669382e-dffc-421d-80a3-82b928f54044-certs\") pod \"cloudkitty-db-sync-6wz7g\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " pod="openstack/cloudkitty-db-sync-6wz7g" Feb 27 00:31:17 crc kubenswrapper[4781]: I0227 00:31:17.061213 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b669382e-dffc-421d-80a3-82b928f54044-scripts\") pod \"cloudkitty-db-sync-6wz7g\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " pod="openstack/cloudkitty-db-sync-6wz7g" Feb 27 00:31:17 crc kubenswrapper[4781]: I0227 00:31:17.061266 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b669382e-dffc-421d-80a3-82b928f54044-config-data\") pod \"cloudkitty-db-sync-6wz7g\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " pod="openstack/cloudkitty-db-sync-6wz7g" Feb 27 00:31:17 crc kubenswrapper[4781]: 
I0227 00:31:17.061302 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2tjz\" (UniqueName: \"kubernetes.io/projected/b669382e-dffc-421d-80a3-82b928f54044-kube-api-access-w2tjz\") pod \"cloudkitty-db-sync-6wz7g\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " pod="openstack/cloudkitty-db-sync-6wz7g" Feb 27 00:31:17 crc kubenswrapper[4781]: I0227 00:31:17.061335 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b669382e-dffc-421d-80a3-82b928f54044-combined-ca-bundle\") pod \"cloudkitty-db-sync-6wz7g\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " pod="openstack/cloudkitty-db-sync-6wz7g" Feb 27 00:31:17 crc kubenswrapper[4781]: I0227 00:31:17.068702 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/b669382e-dffc-421d-80a3-82b928f54044-certs\") pod \"cloudkitty-db-sync-6wz7g\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " pod="openstack/cloudkitty-db-sync-6wz7g" Feb 27 00:31:17 crc kubenswrapper[4781]: I0227 00:31:17.068962 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b669382e-dffc-421d-80a3-82b928f54044-config-data\") pod \"cloudkitty-db-sync-6wz7g\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " pod="openstack/cloudkitty-db-sync-6wz7g" Feb 27 00:31:17 crc kubenswrapper[4781]: I0227 00:31:17.069280 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b669382e-dffc-421d-80a3-82b928f54044-scripts\") pod \"cloudkitty-db-sync-6wz7g\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " pod="openstack/cloudkitty-db-sync-6wz7g" Feb 27 00:31:17 crc kubenswrapper[4781]: I0227 00:31:17.070045 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b669382e-dffc-421d-80a3-82b928f54044-combined-ca-bundle\") pod \"cloudkitty-db-sync-6wz7g\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " pod="openstack/cloudkitty-db-sync-6wz7g" Feb 27 00:31:17 crc kubenswrapper[4781]: I0227 00:31:17.080199 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2tjz\" (UniqueName: \"kubernetes.io/projected/b669382e-dffc-421d-80a3-82b928f54044-kube-api-access-w2tjz\") pod \"cloudkitty-db-sync-6wz7g\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " pod="openstack/cloudkitty-db-sync-6wz7g" Feb 27 00:31:17 crc kubenswrapper[4781]: I0227 00:31:17.239374 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-6wz7g" Feb 27 00:31:17 crc kubenswrapper[4781]: I0227 00:31:17.322985 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2274af64-0743-4ede-8fb8-e2ed801638ac" path="/var/lib/kubelet/pods/2274af64-0743-4ede-8fb8-e2ed801638ac/volumes" Feb 27 00:31:17 crc kubenswrapper[4781]: I0227 00:31:17.760471 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-6wz7g"] Feb 27 00:31:17 crc kubenswrapper[4781]: W0227 00:31:17.775394 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb669382e_dffc_421d_80a3_82b928f54044.slice/crio-bf0ab0e9643525093c3faf04cca90f385efba4046632e8a89c2d0d13c194fa6e WatchSource:0}: Error finding container bf0ab0e9643525093c3faf04cca90f385efba4046632e8a89c2d0d13c194fa6e: Status 404 returned error can't find the container with id bf0ab0e9643525093c3faf04cca90f385efba4046632e8a89c2d0d13c194fa6e Feb 27 00:31:17 crc kubenswrapper[4781]: I0227 00:31:17.799688 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-6wz7g" 
event={"ID":"b669382e-dffc-421d-80a3-82b928f54044","Type":"ContainerStarted","Data":"bf0ab0e9643525093c3faf04cca90f385efba4046632e8a89c2d0d13c194fa6e"} Feb 27 00:31:18 crc kubenswrapper[4781]: I0227 00:31:18.808869 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-6wz7g" event={"ID":"b669382e-dffc-421d-80a3-82b928f54044","Type":"ContainerStarted","Data":"08009d33d7dd60364f173703aa207fb7fe65cb10f22855e575d2a1e3d49e40a0"} Feb 27 00:31:18 crc kubenswrapper[4781]: I0227 00:31:18.842126 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-sync-6wz7g" podStartSLOduration=2.690077798 podStartE2EDuration="2.842091685s" podCreationTimestamp="2026-02-27 00:31:16 +0000 UTC" firstStartedPulling="2026-02-27 00:31:17.780142785 +0000 UTC m=+1547.037682339" lastFinishedPulling="2026-02-27 00:31:17.932156672 +0000 UTC m=+1547.189696226" observedRunningTime="2026-02-27 00:31:18.826577576 +0000 UTC m=+1548.084117130" watchObservedRunningTime="2026-02-27 00:31:18.842091685 +0000 UTC m=+1548.099631229" Feb 27 00:31:19 crc kubenswrapper[4781]: I0227 00:31:19.109087 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 27 00:31:19 crc kubenswrapper[4781]: I0227 00:31:19.159765 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:31:19 crc kubenswrapper[4781]: I0227 00:31:19.160034 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerName="ceilometer-central-agent" containerID="cri-o://9f4615d95e6e63d1efd4e66baa23b1f80ac3bd3f83b086b7f16e524d977e4e51" gracePeriod=30 Feb 27 00:31:19 crc kubenswrapper[4781]: I0227 00:31:19.160080 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerName="proxy-httpd" 
containerID="cri-o://11c44c35194959cc78dca0db7f484fb501625e2e603b3f58466927584ff998a4" gracePeriod=30 Feb 27 00:31:19 crc kubenswrapper[4781]: I0227 00:31:19.160139 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerName="ceilometer-notification-agent" containerID="cri-o://b624b6912b0767e231affc3c40d0f364ae69e607a4f319243ac55c5714e52ed5" gracePeriod=30 Feb 27 00:31:19 crc kubenswrapper[4781]: I0227 00:31:19.160304 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerName="sg-core" containerID="cri-o://1b0cdadfca07605835d8748d2c2b00aa7df4e4445c1a8c1212419d7d09db934f" gracePeriod=30 Feb 27 00:31:19 crc kubenswrapper[4781]: E0227 00:31:19.769774 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod277c2e9c_3c87_442a_b5f6_52f1d63c24e9.slice/crio-9f4615d95e6e63d1efd4e66baa23b1f80ac3bd3f83b086b7f16e524d977e4e51.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod277c2e9c_3c87_442a_b5f6_52f1d63c24e9.slice/crio-conmon-9f4615d95e6e63d1efd4e66baa23b1f80ac3bd3f83b086b7f16e524d977e4e51.scope\": RecentStats: unable to find data in memory cache]" Feb 27 00:31:19 crc kubenswrapper[4781]: I0227 00:31:19.832357 4781 generic.go:334] "Generic (PLEG): container finished" podID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerID="11c44c35194959cc78dca0db7f484fb501625e2e603b3f58466927584ff998a4" exitCode=0 Feb 27 00:31:19 crc kubenswrapper[4781]: I0227 00:31:19.832394 4781 generic.go:334] "Generic (PLEG): container finished" podID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerID="1b0cdadfca07605835d8748d2c2b00aa7df4e4445c1a8c1212419d7d09db934f" exitCode=2 Feb 27 00:31:19 crc 
kubenswrapper[4781]: I0227 00:31:19.832402 4781 generic.go:334] "Generic (PLEG): container finished" podID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerID="9f4615d95e6e63d1efd4e66baa23b1f80ac3bd3f83b086b7f16e524d977e4e51" exitCode=0 Feb 27 00:31:19 crc kubenswrapper[4781]: I0227 00:31:19.832539 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"277c2e9c-3c87-442a-b5f6-52f1d63c24e9","Type":"ContainerDied","Data":"11c44c35194959cc78dca0db7f484fb501625e2e603b3f58466927584ff998a4"} Feb 27 00:31:19 crc kubenswrapper[4781]: I0227 00:31:19.832593 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"277c2e9c-3c87-442a-b5f6-52f1d63c24e9","Type":"ContainerDied","Data":"1b0cdadfca07605835d8748d2c2b00aa7df4e4445c1a8c1212419d7d09db934f"} Feb 27 00:31:19 crc kubenswrapper[4781]: I0227 00:31:19.832605 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"277c2e9c-3c87-442a-b5f6-52f1d63c24e9","Type":"ContainerDied","Data":"9f4615d95e6e63d1efd4e66baa23b1f80ac3bd3f83b086b7f16e524d977e4e51"} Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.129517 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.737653 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.842844 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-log-httpd\") pod \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.842942 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njzlk\" (UniqueName: \"kubernetes.io/projected/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-kube-api-access-njzlk\") pod \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.842984 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-config-data\") pod \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.843018 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-run-httpd\") pod \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.843068 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-combined-ca-bundle\") pod \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.843142 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-scripts\") pod \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.843183 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-ceilometer-tls-certs\") pod \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.843293 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-sg-core-conf-yaml\") pod \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\" (UID: \"277c2e9c-3c87-442a-b5f6-52f1d63c24e9\") " Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.846559 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "277c2e9c-3c87-442a-b5f6-52f1d63c24e9" (UID: "277c2e9c-3c87-442a-b5f6-52f1d63c24e9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.846887 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "277c2e9c-3c87-442a-b5f6-52f1d63c24e9" (UID: "277c2e9c-3c87-442a-b5f6-52f1d63c24e9"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.873134 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-scripts" (OuterVolumeSpecName: "scripts") pod "277c2e9c-3c87-442a-b5f6-52f1d63c24e9" (UID: "277c2e9c-3c87-442a-b5f6-52f1d63c24e9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.873193 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-kube-api-access-njzlk" (OuterVolumeSpecName: "kube-api-access-njzlk") pod "277c2e9c-3c87-442a-b5f6-52f1d63c24e9" (UID: "277c2e9c-3c87-442a-b5f6-52f1d63c24e9"). InnerVolumeSpecName "kube-api-access-njzlk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.885923 4781 generic.go:334] "Generic (PLEG): container finished" podID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerID="b624b6912b0767e231affc3c40d0f364ae69e607a4f319243ac55c5714e52ed5" exitCode=0 Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.885984 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"277c2e9c-3c87-442a-b5f6-52f1d63c24e9","Type":"ContainerDied","Data":"b624b6912b0767e231affc3c40d0f364ae69e607a4f319243ac55c5714e52ed5"} Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.886010 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"277c2e9c-3c87-442a-b5f6-52f1d63c24e9","Type":"ContainerDied","Data":"94c7511afd7913c3a074803cf7f0cdd498d1a4a5ebb3ef1f330c0237d0afa73c"} Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.886026 4781 scope.go:117] "RemoveContainer" containerID="11c44c35194959cc78dca0db7f484fb501625e2e603b3f58466927584ff998a4" Feb 27 00:31:20 crc 
kubenswrapper[4781]: I0227 00:31:20.886197 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.887852 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "277c2e9c-3c87-442a-b5f6-52f1d63c24e9" (UID: "277c2e9c-3c87-442a-b5f6-52f1d63c24e9"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.945054 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "277c2e9c-3c87-442a-b5f6-52f1d63c24e9" (UID: "277c2e9c-3c87-442a-b5f6-52f1d63c24e9"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.946275 4781 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.946321 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njzlk\" (UniqueName: \"kubernetes.io/projected/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-kube-api-access-njzlk\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.946333 4781 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.946344 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.946356 4781 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:20 crc kubenswrapper[4781]: I0227 00:31:20.946367 4781 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.017302 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "277c2e9c-3c87-442a-b5f6-52f1d63c24e9" (UID: "277c2e9c-3c87-442a-b5f6-52f1d63c24e9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.020519 4781 scope.go:117] "RemoveContainer" containerID="1b0cdadfca07605835d8748d2c2b00aa7df4e4445c1a8c1212419d7d09db934f" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.054225 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.055455 4781 scope.go:117] "RemoveContainer" containerID="b624b6912b0767e231affc3c40d0f364ae69e607a4f319243ac55c5714e52ed5" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.079755 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-config-data" (OuterVolumeSpecName: "config-data") pod "277c2e9c-3c87-442a-b5f6-52f1d63c24e9" (UID: "277c2e9c-3c87-442a-b5f6-52f1d63c24e9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.091684 4781 scope.go:117] "RemoveContainer" containerID="9f4615d95e6e63d1efd4e66baa23b1f80ac3bd3f83b086b7f16e524d977e4e51" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.132821 4781 scope.go:117] "RemoveContainer" containerID="11c44c35194959cc78dca0db7f484fb501625e2e603b3f58466927584ff998a4" Feb 27 00:31:21 crc kubenswrapper[4781]: E0227 00:31:21.137059 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11c44c35194959cc78dca0db7f484fb501625e2e603b3f58466927584ff998a4\": container with ID starting with 11c44c35194959cc78dca0db7f484fb501625e2e603b3f58466927584ff998a4 not found: ID does not exist" containerID="11c44c35194959cc78dca0db7f484fb501625e2e603b3f58466927584ff998a4" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.137109 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11c44c35194959cc78dca0db7f484fb501625e2e603b3f58466927584ff998a4"} err="failed to get container status \"11c44c35194959cc78dca0db7f484fb501625e2e603b3f58466927584ff998a4\": rpc error: code = NotFound desc = could not find container \"11c44c35194959cc78dca0db7f484fb501625e2e603b3f58466927584ff998a4\": container with ID starting with 11c44c35194959cc78dca0db7f484fb501625e2e603b3f58466927584ff998a4 not found: ID does not exist" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.137134 4781 scope.go:117] "RemoveContainer" containerID="1b0cdadfca07605835d8748d2c2b00aa7df4e4445c1a8c1212419d7d09db934f" Feb 27 00:31:21 crc kubenswrapper[4781]: E0227 00:31:21.137802 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b0cdadfca07605835d8748d2c2b00aa7df4e4445c1a8c1212419d7d09db934f\": container with ID starting with 
1b0cdadfca07605835d8748d2c2b00aa7df4e4445c1a8c1212419d7d09db934f not found: ID does not exist" containerID="1b0cdadfca07605835d8748d2c2b00aa7df4e4445c1a8c1212419d7d09db934f" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.137911 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b0cdadfca07605835d8748d2c2b00aa7df4e4445c1a8c1212419d7d09db934f"} err="failed to get container status \"1b0cdadfca07605835d8748d2c2b00aa7df4e4445c1a8c1212419d7d09db934f\": rpc error: code = NotFound desc = could not find container \"1b0cdadfca07605835d8748d2c2b00aa7df4e4445c1a8c1212419d7d09db934f\": container with ID starting with 1b0cdadfca07605835d8748d2c2b00aa7df4e4445c1a8c1212419d7d09db934f not found: ID does not exist" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.138003 4781 scope.go:117] "RemoveContainer" containerID="b624b6912b0767e231affc3c40d0f364ae69e607a4f319243ac55c5714e52ed5" Feb 27 00:31:21 crc kubenswrapper[4781]: E0227 00:31:21.142146 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b624b6912b0767e231affc3c40d0f364ae69e607a4f319243ac55c5714e52ed5\": container with ID starting with b624b6912b0767e231affc3c40d0f364ae69e607a4f319243ac55c5714e52ed5 not found: ID does not exist" containerID="b624b6912b0767e231affc3c40d0f364ae69e607a4f319243ac55c5714e52ed5" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.142247 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b624b6912b0767e231affc3c40d0f364ae69e607a4f319243ac55c5714e52ed5"} err="failed to get container status \"b624b6912b0767e231affc3c40d0f364ae69e607a4f319243ac55c5714e52ed5\": rpc error: code = NotFound desc = could not find container \"b624b6912b0767e231affc3c40d0f364ae69e607a4f319243ac55c5714e52ed5\": container with ID starting with b624b6912b0767e231affc3c40d0f364ae69e607a4f319243ac55c5714e52ed5 not found: ID does not 
exist" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.142311 4781 scope.go:117] "RemoveContainer" containerID="9f4615d95e6e63d1efd4e66baa23b1f80ac3bd3f83b086b7f16e524d977e4e51" Feb 27 00:31:21 crc kubenswrapper[4781]: E0227 00:31:21.142579 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f4615d95e6e63d1efd4e66baa23b1f80ac3bd3f83b086b7f16e524d977e4e51\": container with ID starting with 9f4615d95e6e63d1efd4e66baa23b1f80ac3bd3f83b086b7f16e524d977e4e51 not found: ID does not exist" containerID="9f4615d95e6e63d1efd4e66baa23b1f80ac3bd3f83b086b7f16e524d977e4e51" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.142674 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f4615d95e6e63d1efd4e66baa23b1f80ac3bd3f83b086b7f16e524d977e4e51"} err="failed to get container status \"9f4615d95e6e63d1efd4e66baa23b1f80ac3bd3f83b086b7f16e524d977e4e51\": rpc error: code = NotFound desc = could not find container \"9f4615d95e6e63d1efd4e66baa23b1f80ac3bd3f83b086b7f16e524d977e4e51\": container with ID starting with 9f4615d95e6e63d1efd4e66baa23b1f80ac3bd3f83b086b7f16e524d977e4e51 not found: ID does not exist" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.159289 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/277c2e9c-3c87-442a-b5f6-52f1d63c24e9-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.227783 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.244304 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.288559 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:31:21 crc kubenswrapper[4781]: E0227 
00:31:21.288991 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerName="proxy-httpd" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.289008 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerName="proxy-httpd" Feb 27 00:31:21 crc kubenswrapper[4781]: E0227 00:31:21.289026 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerName="sg-core" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.289032 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerName="sg-core" Feb 27 00:31:21 crc kubenswrapper[4781]: E0227 00:31:21.289042 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerName="ceilometer-notification-agent" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.289051 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerName="ceilometer-notification-agent" Feb 27 00:31:21 crc kubenswrapper[4781]: E0227 00:31:21.289069 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerName="ceilometer-central-agent" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.289074 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerName="ceilometer-central-agent" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.289260 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerName="ceilometer-central-agent" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.289276 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerName="ceilometer-notification-agent" Feb 27 
00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.289286 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerName="sg-core" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.289297 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" containerName="proxy-httpd" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.291074 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.297325 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.297724 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.298191 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.322608 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="277c2e9c-3c87-442a-b5f6-52f1d63c24e9" path="/var/lib/kubelet/pods/277c2e9c-3c87-442a-b5f6-52f1d63c24e9/volumes" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.323520 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.362521 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.362589 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.362685 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-scripts\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.362756 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-config-data\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.362777 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-log-httpd\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.362812 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g8mz\" (UniqueName: \"kubernetes.io/projected/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-kube-api-access-8g8mz\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.362854 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-run-httpd\") pod \"ceilometer-0\" (UID: 
\"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.362900 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.464420 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g8mz\" (UniqueName: \"kubernetes.io/projected/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-kube-api-access-8g8mz\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.464797 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-run-httpd\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.464834 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.464918 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.464941 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.464987 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-scripts\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.465016 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-config-data\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.465038 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-log-httpd\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.465310 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-run-httpd\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.465472 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-log-httpd\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0" Feb 27 
00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.471345 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.474195 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.475183 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-scripts\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.476193 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.482058 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-config-data\") pod \"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.491437 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g8mz\" (UniqueName: \"kubernetes.io/projected/4f5736d7-ab3f-41d9-b5ec-94da30e708f1-kube-api-access-8g8mz\") pod 
\"ceilometer-0\" (UID: \"4f5736d7-ab3f-41d9-b5ec-94da30e708f1\") " pod="openstack/ceilometer-0" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.609124 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.907145 4781 generic.go:334] "Generic (PLEG): container finished" podID="b669382e-dffc-421d-80a3-82b928f54044" containerID="08009d33d7dd60364f173703aa207fb7fe65cb10f22855e575d2a1e3d49e40a0" exitCode=0 Feb 27 00:31:21 crc kubenswrapper[4781]: I0227 00:31:21.907219 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-6wz7g" event={"ID":"b669382e-dffc-421d-80a3-82b928f54044","Type":"ContainerDied","Data":"08009d33d7dd60364f173703aa207fb7fe65cb10f22855e575d2a1e3d49e40a0"} Feb 27 00:31:22 crc kubenswrapper[4781]: I0227 00:31:22.179740 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 00:31:22 crc kubenswrapper[4781]: I0227 00:31:22.953505 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f5736d7-ab3f-41d9-b5ec-94da30e708f1","Type":"ContainerStarted","Data":"01d7a9064d9fb0090af236b1310f46844f7df89bee45fc33b93891145fc80815"} Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.473342 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-6wz7g" Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.616117 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b669382e-dffc-421d-80a3-82b928f54044-scripts\") pod \"b669382e-dffc-421d-80a3-82b928f54044\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.616215 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b669382e-dffc-421d-80a3-82b928f54044-combined-ca-bundle\") pod \"b669382e-dffc-421d-80a3-82b928f54044\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.616282 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b669382e-dffc-421d-80a3-82b928f54044-config-data\") pod \"b669382e-dffc-421d-80a3-82b928f54044\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.616389 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/b669382e-dffc-421d-80a3-82b928f54044-certs\") pod \"b669382e-dffc-421d-80a3-82b928f54044\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.616418 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2tjz\" (UniqueName: \"kubernetes.io/projected/b669382e-dffc-421d-80a3-82b928f54044-kube-api-access-w2tjz\") pod \"b669382e-dffc-421d-80a3-82b928f54044\" (UID: \"b669382e-dffc-421d-80a3-82b928f54044\") " Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.623498 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/b669382e-dffc-421d-80a3-82b928f54044-scripts" (OuterVolumeSpecName: "scripts") pod "b669382e-dffc-421d-80a3-82b928f54044" (UID: "b669382e-dffc-421d-80a3-82b928f54044"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.623821 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b669382e-dffc-421d-80a3-82b928f54044-certs" (OuterVolumeSpecName: "certs") pod "b669382e-dffc-421d-80a3-82b928f54044" (UID: "b669382e-dffc-421d-80a3-82b928f54044"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.639501 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b669382e-dffc-421d-80a3-82b928f54044-kube-api-access-w2tjz" (OuterVolumeSpecName: "kube-api-access-w2tjz") pod "b669382e-dffc-421d-80a3-82b928f54044" (UID: "b669382e-dffc-421d-80a3-82b928f54044"). InnerVolumeSpecName "kube-api-access-w2tjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.669139 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b669382e-dffc-421d-80a3-82b928f54044-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b669382e-dffc-421d-80a3-82b928f54044" (UID: "b669382e-dffc-421d-80a3-82b928f54044"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.690105 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b669382e-dffc-421d-80a3-82b928f54044-config-data" (OuterVolumeSpecName: "config-data") pod "b669382e-dffc-421d-80a3-82b928f54044" (UID: "b669382e-dffc-421d-80a3-82b928f54044"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.721812 4781 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/b669382e-dffc-421d-80a3-82b928f54044-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.721854 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2tjz\" (UniqueName: \"kubernetes.io/projected/b669382e-dffc-421d-80a3-82b928f54044-kube-api-access-w2tjz\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.721868 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b669382e-dffc-421d-80a3-82b928f54044-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.721879 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b669382e-dffc-421d-80a3-82b928f54044-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.721890 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b669382e-dffc-421d-80a3-82b928f54044-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.966611 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-6wz7g" event={"ID":"b669382e-dffc-421d-80a3-82b928f54044","Type":"ContainerDied","Data":"bf0ab0e9643525093c3faf04cca90f385efba4046632e8a89c2d0d13c194fa6e"} Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.966662 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf0ab0e9643525093c3faf04cca90f385efba4046632e8a89c2d0d13c194fa6e" Feb 27 00:31:23 crc kubenswrapper[4781]: I0227 00:31:23.966692 4781 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-6wz7g" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.211213 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="919ba171-1971-416c-99c1-5dfcacc10a28" containerName="rabbitmq" containerID="cri-o://dba61167d0afe993e4003914cea9d9f848cdf347ff3ec2b7de108d354e6bd19f" gracePeriod=604794 Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.236775 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-storageinit-c5vn9"] Feb 27 00:31:25 crc kubenswrapper[4781]: E0227 00:31:25.239061 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b669382e-dffc-421d-80a3-82b928f54044" containerName="cloudkitty-db-sync" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.239091 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b669382e-dffc-421d-80a3-82b928f54044" containerName="cloudkitty-db-sync" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.239306 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="b669382e-dffc-421d-80a3-82b928f54044" containerName="cloudkitty-db-sync" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.240009 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.242292 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.269292 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-c5vn9"] Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.273521 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fee23b33-5d55-45c9-b024-0b4865019095-scripts\") pod \"cloudkitty-storageinit-c5vn9\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.273615 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fee23b33-5d55-45c9-b024-0b4865019095-certs\") pod \"cloudkitty-storageinit-c5vn9\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.273654 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee23b33-5d55-45c9-b024-0b4865019095-combined-ca-bundle\") pod \"cloudkitty-storageinit-c5vn9\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.273703 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpvfl\" (UniqueName: \"kubernetes.io/projected/fee23b33-5d55-45c9-b024-0b4865019095-kube-api-access-cpvfl\") pod \"cloudkitty-storageinit-c5vn9\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " 
pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.273770 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fee23b33-5d55-45c9-b024-0b4865019095-config-data\") pod \"cloudkitty-storageinit-c5vn9\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.281657 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-storageinit-g672n"] Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.290129 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-storageinit-g672n"] Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.321857 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87b3198c-30ab-415a-b24b-b26ab3da838e" path="/var/lib/kubelet/pods/87b3198c-30ab-415a-b24b-b26ab3da838e/volumes" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.375695 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fee23b33-5d55-45c9-b024-0b4865019095-config-data\") pod \"cloudkitty-storageinit-c5vn9\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.375836 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fee23b33-5d55-45c9-b024-0b4865019095-scripts\") pod \"cloudkitty-storageinit-c5vn9\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.375909 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/projected/fee23b33-5d55-45c9-b024-0b4865019095-certs\") pod \"cloudkitty-storageinit-c5vn9\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.375932 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee23b33-5d55-45c9-b024-0b4865019095-combined-ca-bundle\") pod \"cloudkitty-storageinit-c5vn9\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.375989 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpvfl\" (UniqueName: \"kubernetes.io/projected/fee23b33-5d55-45c9-b024-0b4865019095-kube-api-access-cpvfl\") pod \"cloudkitty-storageinit-c5vn9\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.385527 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee23b33-5d55-45c9-b024-0b4865019095-combined-ca-bundle\") pod \"cloudkitty-storageinit-c5vn9\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.387838 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fee23b33-5d55-45c9-b024-0b4865019095-certs\") pod \"cloudkitty-storageinit-c5vn9\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.389706 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fee23b33-5d55-45c9-b024-0b4865019095-config-data\") pod 
\"cloudkitty-storageinit-c5vn9\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.400171 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpvfl\" (UniqueName: \"kubernetes.io/projected/fee23b33-5d55-45c9-b024-0b4865019095-kube-api-access-cpvfl\") pod \"cloudkitty-storageinit-c5vn9\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.411158 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fee23b33-5d55-45c9-b024-0b4865019095-scripts\") pod \"cloudkitty-storageinit-c5vn9\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.570574 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:25 crc kubenswrapper[4781]: I0227 00:31:25.905941 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="c7ca2a9f-a42e-4d9b-89a7-f2590842f328" containerName="rabbitmq" containerID="cri-o://84e4c6c19d757fd81ef5f856104b51d9057ffe90f91b0313f39e58f7d670a984" gracePeriod=604795 Feb 27 00:31:26 crc kubenswrapper[4781]: I0227 00:31:26.351925 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 00:31:26 crc kubenswrapper[4781]: I0227 00:31:26.600484 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-c5vn9"] Feb 27 00:31:27 crc kubenswrapper[4781]: I0227 00:31:27.258369 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-c5vn9" 
event={"ID":"fee23b33-5d55-45c9-b024-0b4865019095","Type":"ContainerStarted","Data":"4c15c466d7915dc653aadd3dff0e84b4a8fd3f49a7805b84c66c98b2891abd65"} Feb 27 00:31:27 crc kubenswrapper[4781]: I0227 00:31:27.258655 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-c5vn9" event={"ID":"fee23b33-5d55-45c9-b024-0b4865019095","Type":"ContainerStarted","Data":"c1cf39a6e1aaa1fdbc695699fd6efe141913102901ca2317e8b825da4d37a1de"} Feb 27 00:31:27 crc kubenswrapper[4781]: I0227 00:31:27.260116 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f5736d7-ab3f-41d9-b5ec-94da30e708f1","Type":"ContainerStarted","Data":"5fba32f66c88e72d5dff32f9fc4c8c9e3acbeab261897ce7904168caa209899e"} Feb 27 00:31:27 crc kubenswrapper[4781]: I0227 00:31:27.260138 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f5736d7-ab3f-41d9-b5ec-94da30e708f1","Type":"ContainerStarted","Data":"54d4957f3fe1d6d4bccbcd59c6c99d49bb1a6b6984834e6da5136a16d63d4bcd"} Feb 27 00:31:27 crc kubenswrapper[4781]: I0227 00:31:27.284559 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-storageinit-c5vn9" podStartSLOduration=3.284540779 podStartE2EDuration="3.284540779s" podCreationTimestamp="2026-02-27 00:31:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:31:27.279075001 +0000 UTC m=+1556.536614555" watchObservedRunningTime="2026-02-27 00:31:27.284540779 +0000 UTC m=+1556.542080333" Feb 27 00:31:28 crc kubenswrapper[4781]: I0227 00:31:28.272923 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f5736d7-ab3f-41d9-b5ec-94da30e708f1","Type":"ContainerStarted","Data":"e0a664c2f58126375bd72172545af9a09ea54c63d627e55c8853f530614522f6"} Feb 27 00:31:29 crc kubenswrapper[4781]: I0227 00:31:29.285837 
4781 generic.go:334] "Generic (PLEG): container finished" podID="fee23b33-5d55-45c9-b024-0b4865019095" containerID="4c15c466d7915dc653aadd3dff0e84b4a8fd3f49a7805b84c66c98b2891abd65" exitCode=0 Feb 27 00:31:29 crc kubenswrapper[4781]: I0227 00:31:29.285918 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-c5vn9" event={"ID":"fee23b33-5d55-45c9-b024-0b4865019095","Type":"ContainerDied","Data":"4c15c466d7915dc653aadd3dff0e84b4a8fd3f49a7805b84c66c98b2891abd65"} Feb 27 00:31:29 crc kubenswrapper[4781]: I0227 00:31:29.362758 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="919ba171-1971-416c-99c1-5dfcacc10a28" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.109:5671: connect: connection refused" Feb 27 00:31:29 crc kubenswrapper[4781]: I0227 00:31:29.758786 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="c7ca2a9f-a42e-4d9b-89a7-f2590842f328" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.110:5671: connect: connection refused" Feb 27 00:31:30 crc kubenswrapper[4781]: I0227 00:31:30.321474 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f5736d7-ab3f-41d9-b5ec-94da30e708f1","Type":"ContainerStarted","Data":"9a972aade56daf285cc3b78040b476a15071b5264e6b20ac696006680085dedd"} Feb 27 00:31:30 crc kubenswrapper[4781]: I0227 00:31:30.363556 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.788936294 podStartE2EDuration="9.363533161s" podCreationTimestamp="2026-02-27 00:31:21 +0000 UTC" firstStartedPulling="2026-02-27 00:31:22.182731597 +0000 UTC m=+1551.440271151" lastFinishedPulling="2026-02-27 00:31:29.757328464 +0000 UTC m=+1559.014868018" observedRunningTime="2026-02-27 00:31:30.347888948 +0000 UTC m=+1559.605428532" watchObservedRunningTime="2026-02-27 
00:31:30.363533161 +0000 UTC m=+1559.621072725" Feb 27 00:31:30 crc kubenswrapper[4781]: I0227 00:31:30.860044 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:30 crc kubenswrapper[4781]: I0227 00:31:30.989904 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee23b33-5d55-45c9-b024-0b4865019095-combined-ca-bundle\") pod \"fee23b33-5d55-45c9-b024-0b4865019095\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " Feb 27 00:31:30 crc kubenswrapper[4781]: I0227 00:31:30.989966 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fee23b33-5d55-45c9-b024-0b4865019095-config-data\") pod \"fee23b33-5d55-45c9-b024-0b4865019095\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " Feb 27 00:31:30 crc kubenswrapper[4781]: I0227 00:31:30.989997 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fee23b33-5d55-45c9-b024-0b4865019095-certs\") pod \"fee23b33-5d55-45c9-b024-0b4865019095\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " Feb 27 00:31:30 crc kubenswrapper[4781]: I0227 00:31:30.990061 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpvfl\" (UniqueName: \"kubernetes.io/projected/fee23b33-5d55-45c9-b024-0b4865019095-kube-api-access-cpvfl\") pod \"fee23b33-5d55-45c9-b024-0b4865019095\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " Feb 27 00:31:30 crc kubenswrapper[4781]: I0227 00:31:30.990211 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fee23b33-5d55-45c9-b024-0b4865019095-scripts\") pod \"fee23b33-5d55-45c9-b024-0b4865019095\" (UID: \"fee23b33-5d55-45c9-b024-0b4865019095\") " Feb 27 00:31:30 
crc kubenswrapper[4781]: I0227 00:31:30.995995 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fee23b33-5d55-45c9-b024-0b4865019095-kube-api-access-cpvfl" (OuterVolumeSpecName: "kube-api-access-cpvfl") pod "fee23b33-5d55-45c9-b024-0b4865019095" (UID: "fee23b33-5d55-45c9-b024-0b4865019095"). InnerVolumeSpecName "kube-api-access-cpvfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:31:30 crc kubenswrapper[4781]: I0227 00:31:30.996431 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fee23b33-5d55-45c9-b024-0b4865019095-certs" (OuterVolumeSpecName: "certs") pod "fee23b33-5d55-45c9-b024-0b4865019095" (UID: "fee23b33-5d55-45c9-b024-0b4865019095"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:31:31 crc kubenswrapper[4781]: I0227 00:31:31.010846 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fee23b33-5d55-45c9-b024-0b4865019095-scripts" (OuterVolumeSpecName: "scripts") pod "fee23b33-5d55-45c9-b024-0b4865019095" (UID: "fee23b33-5d55-45c9-b024-0b4865019095"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:31:31 crc kubenswrapper[4781]: I0227 00:31:31.020252 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fee23b33-5d55-45c9-b024-0b4865019095-config-data" (OuterVolumeSpecName: "config-data") pod "fee23b33-5d55-45c9-b024-0b4865019095" (UID: "fee23b33-5d55-45c9-b024-0b4865019095"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:31:31 crc kubenswrapper[4781]: I0227 00:31:31.040959 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fee23b33-5d55-45c9-b024-0b4865019095-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fee23b33-5d55-45c9-b024-0b4865019095" (UID: "fee23b33-5d55-45c9-b024-0b4865019095"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:31:31 crc kubenswrapper[4781]: I0227 00:31:31.093808 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fee23b33-5d55-45c9-b024-0b4865019095-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:31 crc kubenswrapper[4781]: I0227 00:31:31.093846 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee23b33-5d55-45c9-b024-0b4865019095-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:31 crc kubenswrapper[4781]: I0227 00:31:31.093858 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fee23b33-5d55-45c9-b024-0b4865019095-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:31 crc kubenswrapper[4781]: I0227 00:31:31.093887 4781 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fee23b33-5d55-45c9-b024-0b4865019095-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:31 crc kubenswrapper[4781]: I0227 00:31:31.093896 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpvfl\" (UniqueName: \"kubernetes.io/projected/fee23b33-5d55-45c9-b024-0b4865019095-kube-api-access-cpvfl\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:31 crc kubenswrapper[4781]: I0227 00:31:31.337541 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-c5vn9" Feb 27 00:31:31 crc kubenswrapper[4781]: I0227 00:31:31.337999 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 27 00:31:31 crc kubenswrapper[4781]: I0227 00:31:31.338030 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-c5vn9" event={"ID":"fee23b33-5d55-45c9-b024-0b4865019095","Type":"ContainerDied","Data":"c1cf39a6e1aaa1fdbc695699fd6efe141913102901ca2317e8b825da4d37a1de"} Feb 27 00:31:31 crc kubenswrapper[4781]: I0227 00:31:31.338053 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1cf39a6e1aaa1fdbc695699fd6efe141913102901ca2317e8b825da4d37a1de" Feb 27 00:31:31 crc kubenswrapper[4781]: I0227 00:31:31.526436 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 27 00:31:31 crc kubenswrapper[4781]: I0227 00:31:31.526684 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-proc-0" podUID="db34a476-dd22-4085-bb2c-a8e57b0d9889" containerName="cloudkitty-proc" containerID="cri-o://b68ba0c8681372d6b46f2eda69405ebdb37954378e7c21495676222df54a1d3a" gracePeriod=30 Feb 27 00:31:31 crc kubenswrapper[4781]: I0227 00:31:31.559879 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 27 00:31:31 crc kubenswrapper[4781]: I0227 00:31:31.560117 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="75721c64-91e7-468b-8157-9f7b0f8060b0" containerName="cloudkitty-api-log" containerID="cri-o://d2ded5bf94c8c14f72027674825df34d27ac9ed838299103764bc5aba595b241" gracePeriod=30 Feb 27 00:31:31 crc kubenswrapper[4781]: I0227 00:31:31.560239 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="75721c64-91e7-468b-8157-9f7b0f8060b0" 
containerName="cloudkitty-api" containerID="cri-o://e213bc3be73bd79fd15b3c136a6bd5766ca6edcbbbd34d62e83bc41711b9178a" gracePeriod=30 Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.205730 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.224877 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-tls\") pod \"919ba171-1971-416c-99c1-5dfcacc10a28\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.224957 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/919ba171-1971-416c-99c1-5dfcacc10a28-plugins-conf\") pod \"919ba171-1971-416c-99c1-5dfcacc10a28\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.224996 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-erlang-cookie\") pod \"919ba171-1971-416c-99c1-5dfcacc10a28\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.225029 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tf9tq\" (UniqueName: \"kubernetes.io/projected/919ba171-1971-416c-99c1-5dfcacc10a28-kube-api-access-tf9tq\") pod \"919ba171-1971-416c-99c1-5dfcacc10a28\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.225112 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-confd\") pod \"919ba171-1971-416c-99c1-5dfcacc10a28\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.225148 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-plugins\") pod \"919ba171-1971-416c-99c1-5dfcacc10a28\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.225179 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/919ba171-1971-416c-99c1-5dfcacc10a28-pod-info\") pod \"919ba171-1971-416c-99c1-5dfcacc10a28\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.225206 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/919ba171-1971-416c-99c1-5dfcacc10a28-erlang-cookie-secret\") pod \"919ba171-1971-416c-99c1-5dfcacc10a28\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.225818 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32c98d96-9f26-419a-9095-7dcb737794cc\") pod \"919ba171-1971-416c-99c1-5dfcacc10a28\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.225910 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/919ba171-1971-416c-99c1-5dfcacc10a28-server-conf\") pod \"919ba171-1971-416c-99c1-5dfcacc10a28\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " Feb 27 00:31:32 crc kubenswrapper[4781]: 
I0227 00:31:32.225935 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/919ba171-1971-416c-99c1-5dfcacc10a28-config-data\") pod \"919ba171-1971-416c-99c1-5dfcacc10a28\" (UID: \"919ba171-1971-416c-99c1-5dfcacc10a28\") " Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.226486 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/919ba171-1971-416c-99c1-5dfcacc10a28-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "919ba171-1971-416c-99c1-5dfcacc10a28" (UID: "919ba171-1971-416c-99c1-5dfcacc10a28"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.226997 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "919ba171-1971-416c-99c1-5dfcacc10a28" (UID: "919ba171-1971-416c-99c1-5dfcacc10a28"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.231309 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "919ba171-1971-416c-99c1-5dfcacc10a28" (UID: "919ba171-1971-416c-99c1-5dfcacc10a28"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.235878 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "919ba171-1971-416c-99c1-5dfcacc10a28" (UID: "919ba171-1971-416c-99c1-5dfcacc10a28"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.249846 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/919ba171-1971-416c-99c1-5dfcacc10a28-pod-info" (OuterVolumeSpecName: "pod-info") pod "919ba171-1971-416c-99c1-5dfcacc10a28" (UID: "919ba171-1971-416c-99c1-5dfcacc10a28"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.243978 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/919ba171-1971-416c-99c1-5dfcacc10a28-kube-api-access-tf9tq" (OuterVolumeSpecName: "kube-api-access-tf9tq") pod "919ba171-1971-416c-99c1-5dfcacc10a28" (UID: "919ba171-1971-416c-99c1-5dfcacc10a28"). InnerVolumeSpecName "kube-api-access-tf9tq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.289743 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/919ba171-1971-416c-99c1-5dfcacc10a28-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "919ba171-1971-416c-99c1-5dfcacc10a28" (UID: "919ba171-1971-416c-99c1-5dfcacc10a28"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.292907 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32c98d96-9f26-419a-9095-7dcb737794cc" (OuterVolumeSpecName: "persistence") pod "919ba171-1971-416c-99c1-5dfcacc10a28" (UID: "919ba171-1971-416c-99c1-5dfcacc10a28"). InnerVolumeSpecName "pvc-32c98d96-9f26-419a-9095-7dcb737794cc". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.328293 4781 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.328319 4781 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/919ba171-1971-416c-99c1-5dfcacc10a28-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.328329 4781 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.328339 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tf9tq\" (UniqueName: \"kubernetes.io/projected/919ba171-1971-416c-99c1-5dfcacc10a28-kube-api-access-tf9tq\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.328348 4781 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.328355 4781 reconciler_common.go:293] "Volume 
detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/919ba171-1971-416c-99c1-5dfcacc10a28-pod-info\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.328365 4781 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/919ba171-1971-416c-99c1-5dfcacc10a28-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.328386 4781 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-32c98d96-9f26-419a-9095-7dcb737794cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32c98d96-9f26-419a-9095-7dcb737794cc\") on node \"crc\" " Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.371486 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/919ba171-1971-416c-99c1-5dfcacc10a28-server-conf" (OuterVolumeSpecName: "server-conf") pod "919ba171-1971-416c-99c1-5dfcacc10a28" (UID: "919ba171-1971-416c-99c1-5dfcacc10a28"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.432337 4781 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/919ba171-1971-416c-99c1-5dfcacc10a28-server-conf\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.433201 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/919ba171-1971-416c-99c1-5dfcacc10a28-config-data" (OuterVolumeSpecName: "config-data") pod "919ba171-1971-416c-99c1-5dfcacc10a28" (UID: "919ba171-1971-416c-99c1-5dfcacc10a28"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.433439 4781 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.434123 4781 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-32c98d96-9f26-419a-9095-7dcb737794cc" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32c98d96-9f26-419a-9095-7dcb737794cc") on node "crc" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.466970 4781 generic.go:334] "Generic (PLEG): container finished" podID="db34a476-dd22-4085-bb2c-a8e57b0d9889" containerID="b68ba0c8681372d6b46f2eda69405ebdb37954378e7c21495676222df54a1d3a" exitCode=0 Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.467421 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"db34a476-dd22-4085-bb2c-a8e57b0d9889","Type":"ContainerDied","Data":"b68ba0c8681372d6b46f2eda69405ebdb37954378e7c21495676222df54a1d3a"} Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.502988 4781 generic.go:334] "Generic (PLEG): container finished" podID="919ba171-1971-416c-99c1-5dfcacc10a28" containerID="dba61167d0afe993e4003914cea9d9f848cdf347ff3ec2b7de108d354e6bd19f" exitCode=0 Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.503045 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"919ba171-1971-416c-99c1-5dfcacc10a28","Type":"ContainerDied","Data":"dba61167d0afe993e4003914cea9d9f848cdf347ff3ec2b7de108d354e6bd19f"} Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.503070 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"919ba171-1971-416c-99c1-5dfcacc10a28","Type":"ContainerDied","Data":"f58e1ef93098c46c57b5e59fd849c5fcd9c3a1bc9f7c9d503b32be5e67364d02"} Feb 27 00:31:32 crc 
kubenswrapper[4781]: I0227 00:31:32.503085 4781 scope.go:117] "RemoveContainer" containerID="dba61167d0afe993e4003914cea9d9f848cdf347ff3ec2b7de108d354e6bd19f" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.503189 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.531602 4781 generic.go:334] "Generic (PLEG): container finished" podID="75721c64-91e7-468b-8157-9f7b0f8060b0" containerID="d2ded5bf94c8c14f72027674825df34d27ac9ed838299103764bc5aba595b241" exitCode=143 Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.531707 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"75721c64-91e7-468b-8157-9f7b0f8060b0","Type":"ContainerDied","Data":"d2ded5bf94c8c14f72027674825df34d27ac9ed838299103764bc5aba595b241"} Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.535495 4781 reconciler_common.go:293] "Volume detached for volume \"pvc-32c98d96-9f26-419a-9095-7dcb737794cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32c98d96-9f26-419a-9095-7dcb737794cc\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.535524 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/919ba171-1971-416c-99c1-5dfcacc10a28-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.545875 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "919ba171-1971-416c-99c1-5dfcacc10a28" (UID: "919ba171-1971-416c-99c1-5dfcacc10a28"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.545961 4781 generic.go:334] "Generic (PLEG): container finished" podID="c7ca2a9f-a42e-4d9b-89a7-f2590842f328" containerID="84e4c6c19d757fd81ef5f856104b51d9057ffe90f91b0313f39e58f7d670a984" exitCode=0 Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.546742 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c7ca2a9f-a42e-4d9b-89a7-f2590842f328","Type":"ContainerDied","Data":"84e4c6c19d757fd81ef5f856104b51d9057ffe90f91b0313f39e58f7d670a984"} Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.597159 4781 scope.go:117] "RemoveContainer" containerID="96bdfdf43af7fb258b62a9d9f821ba1ad93ea827a8cff8b6d290f5066fe5b0b7" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.619702 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.637038 4781 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/919ba171-1971-416c-99c1-5dfcacc10a28-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.668746 4781 scope.go:117] "RemoveContainer" containerID="dba61167d0afe993e4003914cea9d9f848cdf347ff3ec2b7de108d354e6bd19f" Feb 27 00:31:32 crc kubenswrapper[4781]: E0227 00:31:32.669608 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dba61167d0afe993e4003914cea9d9f848cdf347ff3ec2b7de108d354e6bd19f\": container with ID starting with dba61167d0afe993e4003914cea9d9f848cdf347ff3ec2b7de108d354e6bd19f not found: ID does not exist" containerID="dba61167d0afe993e4003914cea9d9f848cdf347ff3ec2b7de108d354e6bd19f" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.669652 4781 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dba61167d0afe993e4003914cea9d9f848cdf347ff3ec2b7de108d354e6bd19f"} err="failed to get container status \"dba61167d0afe993e4003914cea9d9f848cdf347ff3ec2b7de108d354e6bd19f\": rpc error: code = NotFound desc = could not find container \"dba61167d0afe993e4003914cea9d9f848cdf347ff3ec2b7de108d354e6bd19f\": container with ID starting with dba61167d0afe993e4003914cea9d9f848cdf347ff3ec2b7de108d354e6bd19f not found: ID does not exist" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.669677 4781 scope.go:117] "RemoveContainer" containerID="96bdfdf43af7fb258b62a9d9f821ba1ad93ea827a8cff8b6d290f5066fe5b0b7" Feb 27 00:31:32 crc kubenswrapper[4781]: E0227 00:31:32.669861 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96bdfdf43af7fb258b62a9d9f821ba1ad93ea827a8cff8b6d290f5066fe5b0b7\": container with ID starting with 96bdfdf43af7fb258b62a9d9f821ba1ad93ea827a8cff8b6d290f5066fe5b0b7 not found: ID does not exist" containerID="96bdfdf43af7fb258b62a9d9f821ba1ad93ea827a8cff8b6d290f5066fe5b0b7" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.669876 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96bdfdf43af7fb258b62a9d9f821ba1ad93ea827a8cff8b6d290f5066fe5b0b7"} err="failed to get container status \"96bdfdf43af7fb258b62a9d9f821ba1ad93ea827a8cff8b6d290f5066fe5b0b7\": rpc error: code = NotFound desc = could not find container \"96bdfdf43af7fb258b62a9d9f821ba1ad93ea827a8cff8b6d290f5066fe5b0b7\": container with ID starting with 96bdfdf43af7fb258b62a9d9f821ba1ad93ea827a8cff8b6d290f5066fe5b0b7 not found: ID does not exist" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.750818 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-pod-info\") pod 
\"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.750965 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-plugins\") pod \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.751013 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-config-data\") pod \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.751043 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnr8b\" (UniqueName: \"kubernetes.io/projected/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-kube-api-access-dnr8b\") pod \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.751111 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-erlang-cookie\") pod \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.752133 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "c7ca2a9f-a42e-4d9b-89a7-f2590842f328" (UID: "c7ca2a9f-a42e-4d9b-89a7-f2590842f328"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.752259 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\") pod \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.752486 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-server-conf\") pod \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.752532 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-plugins-conf\") pod \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.752565 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-confd\") pod \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.753023 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-erlang-cookie-secret\") pod \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.753115 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-tls\") pod \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\" (UID: \"c7ca2a9f-a42e-4d9b-89a7-f2590842f328\") " Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.753993 4781 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.766043 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-pod-info" (OuterVolumeSpecName: "pod-info") pod "c7ca2a9f-a42e-4d9b-89a7-f2590842f328" (UID: "c7ca2a9f-a42e-4d9b-89a7-f2590842f328"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.768051 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "c7ca2a9f-a42e-4d9b-89a7-f2590842f328" (UID: "c7ca2a9f-a42e-4d9b-89a7-f2590842f328"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.768502 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "c7ca2a9f-a42e-4d9b-89a7-f2590842f328" (UID: "c7ca2a9f-a42e-4d9b-89a7-f2590842f328"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.780820 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-kube-api-access-dnr8b" (OuterVolumeSpecName: "kube-api-access-dnr8b") pod "c7ca2a9f-a42e-4d9b-89a7-f2590842f328" (UID: "c7ca2a9f-a42e-4d9b-89a7-f2590842f328"). InnerVolumeSpecName "kube-api-access-dnr8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.787549 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "c7ca2a9f-a42e-4d9b-89a7-f2590842f328" (UID: "c7ca2a9f-a42e-4d9b-89a7-f2590842f328"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.839895 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "c7ca2a9f-a42e-4d9b-89a7-f2590842f328" (UID: "c7ca2a9f-a42e-4d9b-89a7-f2590842f328"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.868162 4781 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.868222 4781 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-pod-info\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.868235 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnr8b\" (UniqueName: \"kubernetes.io/projected/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-kube-api-access-dnr8b\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.868250 4781 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.868261 4781 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.868272 4781 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.935680 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-config-data" (OuterVolumeSpecName: "config-data") pod "c7ca2a9f-a42e-4d9b-89a7-f2590842f328" (UID: 
"c7ca2a9f-a42e-4d9b-89a7-f2590842f328"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:31:32 crc kubenswrapper[4781]: I0227 00:31:32.975038 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.016577 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.049493 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.109260 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 27 00:31:33 crc kubenswrapper[4781]: E0227 00:31:33.109785 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="919ba171-1971-416c-99c1-5dfcacc10a28" containerName="setup-container" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.109808 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="919ba171-1971-416c-99c1-5dfcacc10a28" containerName="setup-container" Feb 27 00:31:33 crc kubenswrapper[4781]: E0227 00:31:33.109816 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="919ba171-1971-416c-99c1-5dfcacc10a28" containerName="rabbitmq" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.109823 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="919ba171-1971-416c-99c1-5dfcacc10a28" containerName="rabbitmq" Feb 27 00:31:33 crc kubenswrapper[4781]: E0227 00:31:33.109835 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7ca2a9f-a42e-4d9b-89a7-f2590842f328" containerName="rabbitmq" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.109842 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7ca2a9f-a42e-4d9b-89a7-f2590842f328" 
containerName="rabbitmq" Feb 27 00:31:33 crc kubenswrapper[4781]: E0227 00:31:33.109857 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7ca2a9f-a42e-4d9b-89a7-f2590842f328" containerName="setup-container" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.109863 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7ca2a9f-a42e-4d9b-89a7-f2590842f328" containerName="setup-container" Feb 27 00:31:33 crc kubenswrapper[4781]: E0227 00:31:33.109873 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fee23b33-5d55-45c9-b024-0b4865019095" containerName="cloudkitty-storageinit" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.109879 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee23b33-5d55-45c9-b024-0b4865019095" containerName="cloudkitty-storageinit" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.110063 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7ca2a9f-a42e-4d9b-89a7-f2590842f328" containerName="rabbitmq" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.110076 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="fee23b33-5d55-45c9-b024-0b4865019095" containerName="cloudkitty-storageinit" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.110089 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="919ba171-1971-416c-99c1-5dfcacc10a28" containerName="rabbitmq" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.112937 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.122027 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-jcfdg" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.122222 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.122368 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.122469 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.122562 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.122706 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.122804 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.132930 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.150725 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-server-conf" (OuterVolumeSpecName: "server-conf") pod "c7ca2a9f-a42e-4d9b-89a7-f2590842f328" (UID: "c7ca2a9f-a42e-4d9b-89a7-f2590842f328"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.152241 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78362793-d2f7-4c5f-943c-efd8f93773cb" (OuterVolumeSpecName: "persistence") pod "c7ca2a9f-a42e-4d9b-89a7-f2590842f328" (UID: "c7ca2a9f-a42e-4d9b-89a7-f2590842f328"). InnerVolumeSpecName "pvc-78362793-d2f7-4c5f-943c-efd8f93773cb". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.159780 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "c7ca2a9f-a42e-4d9b-89a7-f2590842f328" (UID: "c7ca2a9f-a42e-4d9b-89a7-f2590842f328"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.186248 4781 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-server-conf\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.186281 4781 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c7ca2a9f-a42e-4d9b-89a7-f2590842f328-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.186316 4781 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\") on node \"crc\" " Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.238286 4781 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. 
Skipping UnmountDevice... Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.238440 4781 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-78362793-d2f7-4c5f-943c-efd8f93773cb" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78362793-d2f7-4c5f-943c-efd8f93773cb") on node "crc" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.252197 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.291168 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhzw9\" (UniqueName: \"kubernetes.io/projected/ed38e2f2-b350-4abd-abe2-859c9d504aa8-kube-api-access-qhzw9\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.291239 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ed38e2f2-b350-4abd-abe2-859c9d504aa8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.291296 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ed38e2f2-b350-4abd-abe2-859c9d504aa8-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.291327 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ed38e2f2-b350-4abd-abe2-859c9d504aa8-config-data\") pod \"rabbitmq-server-0\" (UID: 
\"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.291343 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ed38e2f2-b350-4abd-abe2-859c9d504aa8-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.291366 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-32c98d96-9f26-419a-9095-7dcb737794cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32c98d96-9f26-419a-9095-7dcb737794cc\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.291419 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ed38e2f2-b350-4abd-abe2-859c9d504aa8-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.291612 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ed38e2f2-b350-4abd-abe2-859c9d504aa8-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.291749 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ed38e2f2-b350-4abd-abe2-859c9d504aa8-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: 
\"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.292192 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ed38e2f2-b350-4abd-abe2-859c9d504aa8-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.292236 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ed38e2f2-b350-4abd-abe2-859c9d504aa8-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.292411 4781 reconciler_common.go:293] "Volume detached for volume \"pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.323110 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="919ba171-1971-416c-99c1-5dfcacc10a28" path="/var/lib/kubelet/pods/919ba171-1971-416c-99c1-5dfcacc10a28/volumes" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.393524 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-combined-ca-bundle\") pod \"db34a476-dd22-4085-bb2c-a8e57b0d9889\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.393933 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwnr7\" (UniqueName: 
\"kubernetes.io/projected/db34a476-dd22-4085-bb2c-a8e57b0d9889-kube-api-access-qwnr7\") pod \"db34a476-dd22-4085-bb2c-a8e57b0d9889\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.393971 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/db34a476-dd22-4085-bb2c-a8e57b0d9889-certs\") pod \"db34a476-dd22-4085-bb2c-a8e57b0d9889\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.394002 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-config-data-custom\") pod \"db34a476-dd22-4085-bb2c-a8e57b0d9889\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.394066 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-config-data\") pod \"db34a476-dd22-4085-bb2c-a8e57b0d9889\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.394088 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-scripts\") pod \"db34a476-dd22-4085-bb2c-a8e57b0d9889\" (UID: \"db34a476-dd22-4085-bb2c-a8e57b0d9889\") " Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.394757 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ed38e2f2-b350-4abd-abe2-859c9d504aa8-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.394859 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhzw9\" (UniqueName: \"kubernetes.io/projected/ed38e2f2-b350-4abd-abe2-859c9d504aa8-kube-api-access-qhzw9\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.395318 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ed38e2f2-b350-4abd-abe2-859c9d504aa8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.395363 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ed38e2f2-b350-4abd-abe2-859c9d504aa8-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.395394 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ed38e2f2-b350-4abd-abe2-859c9d504aa8-config-data\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.395412 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ed38e2f2-b350-4abd-abe2-859c9d504aa8-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.395432 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-32c98d96-9f26-419a-9095-7dcb737794cc\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32c98d96-9f26-419a-9095-7dcb737794cc\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.395487 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ed38e2f2-b350-4abd-abe2-859c9d504aa8-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.395510 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ed38e2f2-b350-4abd-abe2-859c9d504aa8-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.395556 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ed38e2f2-b350-4abd-abe2-859c9d504aa8-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.395648 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ed38e2f2-b350-4abd-abe2-859c9d504aa8-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.396353 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ed38e2f2-b350-4abd-abe2-859c9d504aa8-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: 
\"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.396505 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ed38e2f2-b350-4abd-abe2-859c9d504aa8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.397037 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ed38e2f2-b350-4abd-abe2-859c9d504aa8-config-data\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.398537 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ed38e2f2-b350-4abd-abe2-859c9d504aa8-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.399934 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.400463 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-32c98d96-9f26-419a-9095-7dcb737794cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32c98d96-9f26-419a-9095-7dcb737794cc\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a1416593fd912ec74c6e12871251980e537685bd157bf8eba211fce64d9b048a/globalmount\"" pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.400167 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ed38e2f2-b350-4abd-abe2-859c9d504aa8-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.400918 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ed38e2f2-b350-4abd-abe2-859c9d504aa8-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.400979 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ed38e2f2-b350-4abd-abe2-859c9d504aa8-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0" Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.401264 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-scripts" (OuterVolumeSpecName: "scripts") pod "db34a476-dd22-4085-bb2c-a8e57b0d9889" (UID: "db34a476-dd22-4085-bb2c-a8e57b0d9889"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.401373 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "db34a476-dd22-4085-bb2c-a8e57b0d9889" (UID: "db34a476-dd22-4085-bb2c-a8e57b0d9889"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.401657 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db34a476-dd22-4085-bb2c-a8e57b0d9889-kube-api-access-qwnr7" (OuterVolumeSpecName: "kube-api-access-qwnr7") pod "db34a476-dd22-4085-bb2c-a8e57b0d9889" (UID: "db34a476-dd22-4085-bb2c-a8e57b0d9889"). InnerVolumeSpecName "kube-api-access-qwnr7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.401896 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db34a476-dd22-4085-bb2c-a8e57b0d9889-certs" (OuterVolumeSpecName: "certs") pod "db34a476-dd22-4085-bb2c-a8e57b0d9889" (UID: "db34a476-dd22-4085-bb2c-a8e57b0d9889"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.402372 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ed38e2f2-b350-4abd-abe2-859c9d504aa8-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.416192 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ed38e2f2-b350-4abd-abe2-859c9d504aa8-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.416678 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhzw9\" (UniqueName: \"kubernetes.io/projected/ed38e2f2-b350-4abd-abe2-859c9d504aa8-kube-api-access-qhzw9\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.445564 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-32c98d96-9f26-419a-9095-7dcb737794cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32c98d96-9f26-419a-9095-7dcb737794cc\") pod \"rabbitmq-server-0\" (UID: \"ed38e2f2-b350-4abd-abe2-859c9d504aa8\") " pod="openstack/rabbitmq-server-0"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.445821 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-config-data" (OuterVolumeSpecName: "config-data") pod "db34a476-dd22-4085-bb2c-a8e57b0d9889" (UID: "db34a476-dd22-4085-bb2c-a8e57b0d9889"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.454019 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db34a476-dd22-4085-bb2c-a8e57b0d9889" (UID: "db34a476-dd22-4085-bb2c-a8e57b0d9889"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.498242 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.498281 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwnr7\" (UniqueName: \"kubernetes.io/projected/db34a476-dd22-4085-bb2c-a8e57b0d9889-kube-api-access-qwnr7\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.498294 4781 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/db34a476-dd22-4085-bb2c-a8e57b0d9889-certs\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.498305 4781 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.498317 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.498329 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db34a476-dd22-4085-bb2c-a8e57b0d9889-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.548753 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.558138 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"db34a476-dd22-4085-bb2c-a8e57b0d9889","Type":"ContainerDied","Data":"3977a787677f2d7efa8ed48ba20f2b14940f8edb53792eab26f57aa6da59e0d1"}
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.558189 4781 scope.go:117] "RemoveContainer" containerID="b68ba0c8681372d6b46f2eda69405ebdb37954378e7c21495676222df54a1d3a"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.558185 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.562212 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c7ca2a9f-a42e-4d9b-89a7-f2590842f328","Type":"ContainerDied","Data":"231128a0e69346808037ae68c6b271f51f365db9ad7d7761da0f5c52d5d3f07d"}
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.562306 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.692604 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.703575 4781 scope.go:117] "RemoveContainer" containerID="84e4c6c19d757fd81ef5f856104b51d9057ffe90f91b0313f39e58f7d670a984"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.758035 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.765733 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"]
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.788018 4781 scope.go:117] "RemoveContainer" containerID="592b25e10dba92f06ec6db612c25fdc12d9afc456496a972e547225b9ac93f91"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.810662 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-proc-0"]
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.822310 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"]
Feb 27 00:31:33 crc kubenswrapper[4781]: E0227 00:31:33.822802 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db34a476-dd22-4085-bb2c-a8e57b0d9889" containerName="cloudkitty-proc"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.822821 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="db34a476-dd22-4085-bb2c-a8e57b0d9889" containerName="cloudkitty-proc"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.823049 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="db34a476-dd22-4085-bb2c-a8e57b0d9889" containerName="cloudkitty-proc"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.825273 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.833348 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.858349 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.860690 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.862859 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.863110 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.863226 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.863392 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.864210 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-t6n8b"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.865681 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.866055 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.868689 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"]
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.892034 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 27 00:31:33 crc kubenswrapper[4781]: I0227 00:31:33.973293 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.014747 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf4c3569-6860-4c2a-8923-42e436279a11-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"cf4c3569-6860-4c2a-8923-42e436279a11\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.014808 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf4c3569-6860-4c2a-8923-42e436279a11-scripts\") pod \"cloudkitty-proc-0\" (UID: \"cf4c3569-6860-4c2a-8923-42e436279a11\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.014843 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.014862 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/cf4c3569-6860-4c2a-8923-42e436279a11-certs\") pod \"cloudkitty-proc-0\" (UID: \"cf4c3569-6860-4c2a-8923-42e436279a11\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.015027 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf4c3569-6860-4c2a-8923-42e436279a11-config-data\") pod \"cloudkitty-proc-0\" (UID: \"cf4c3569-6860-4c2a-8923-42e436279a11\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.015095 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/37519387-1738-4500-9953-52deba3e4a85-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.015139 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/37519387-1738-4500-9953-52deba3e4a85-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.015231 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/37519387-1738-4500-9953-52deba3e4a85-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.015354 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/37519387-1738-4500-9953-52deba3e4a85-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.015484 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/37519387-1738-4500-9953-52deba3e4a85-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.015527 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8nqs\" (UniqueName: \"kubernetes.io/projected/cf4c3569-6860-4c2a-8923-42e436279a11-kube-api-access-x8nqs\") pod \"cloudkitty-proc-0\" (UID: \"cf4c3569-6860-4c2a-8923-42e436279a11\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.015545 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/37519387-1738-4500-9953-52deba3e4a85-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.015563 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/37519387-1738-4500-9953-52deba3e4a85-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.015621 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf4c3569-6860-4c2a-8923-42e436279a11-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"cf4c3569-6860-4c2a-8923-42e436279a11\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.015672 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/37519387-1738-4500-9953-52deba3e4a85-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.015694 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/37519387-1738-4500-9953-52deba3e4a85-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.015711 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxmc9\" (UniqueName: \"kubernetes.io/projected/37519387-1738-4500-9953-52deba3e4a85-kube-api-access-dxmc9\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.117200 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptk2k\" (UniqueName: \"kubernetes.io/projected/75721c64-91e7-468b-8157-9f7b0f8060b0-kube-api-access-ptk2k\") pod \"75721c64-91e7-468b-8157-9f7b0f8060b0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") "
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.117246 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75721c64-91e7-468b-8157-9f7b0f8060b0-logs\") pod \"75721c64-91e7-468b-8157-9f7b0f8060b0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") "
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.117279 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-config-data\") pod \"75721c64-91e7-468b-8157-9f7b0f8060b0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") "
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.117346 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-config-data-custom\") pod \"75721c64-91e7-468b-8157-9f7b0f8060b0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") "
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.117402 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/75721c64-91e7-468b-8157-9f7b0f8060b0-certs\") pod \"75721c64-91e7-468b-8157-9f7b0f8060b0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") "
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.117433 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-internal-tls-certs\") pod \"75721c64-91e7-468b-8157-9f7b0f8060b0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") "
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.117463 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-scripts\") pod \"75721c64-91e7-468b-8157-9f7b0f8060b0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") "
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.117545 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-public-tls-certs\") pod \"75721c64-91e7-468b-8157-9f7b0f8060b0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") "
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.117641 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-combined-ca-bundle\") pod \"75721c64-91e7-468b-8157-9f7b0f8060b0\" (UID: \"75721c64-91e7-468b-8157-9f7b0f8060b0\") "
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.117889 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf4c3569-6860-4c2a-8923-42e436279a11-scripts\") pod \"cloudkitty-proc-0\" (UID: \"cf4c3569-6860-4c2a-8923-42e436279a11\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.117928 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.117948 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/cf4c3569-6860-4c2a-8923-42e436279a11-certs\") pod \"cloudkitty-proc-0\" (UID: \"cf4c3569-6860-4c2a-8923-42e436279a11\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.118386 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf4c3569-6860-4c2a-8923-42e436279a11-config-data\") pod \"cloudkitty-proc-0\" (UID: \"cf4c3569-6860-4c2a-8923-42e436279a11\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.118414 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/37519387-1738-4500-9953-52deba3e4a85-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.118444 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/37519387-1738-4500-9953-52deba3e4a85-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.118475 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/37519387-1738-4500-9953-52deba3e4a85-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.118512 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/37519387-1738-4500-9953-52deba3e4a85-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.118561 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/37519387-1738-4500-9953-52deba3e4a85-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.118585 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8nqs\" (UniqueName: \"kubernetes.io/projected/cf4c3569-6860-4c2a-8923-42e436279a11-kube-api-access-x8nqs\") pod \"cloudkitty-proc-0\" (UID: \"cf4c3569-6860-4c2a-8923-42e436279a11\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.118609 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/37519387-1738-4500-9953-52deba3e4a85-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.118640 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/37519387-1738-4500-9953-52deba3e4a85-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.118660 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf4c3569-6860-4c2a-8923-42e436279a11-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"cf4c3569-6860-4c2a-8923-42e436279a11\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.118688 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/37519387-1738-4500-9953-52deba3e4a85-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.118707 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/37519387-1738-4500-9953-52deba3e4a85-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.118722 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxmc9\" (UniqueName: \"kubernetes.io/projected/37519387-1738-4500-9953-52deba3e4a85-kube-api-access-dxmc9\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.118745 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf4c3569-6860-4c2a-8923-42e436279a11-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"cf4c3569-6860-4c2a-8923-42e436279a11\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.123997 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75721c64-91e7-468b-8157-9f7b0f8060b0-logs" (OuterVolumeSpecName: "logs") pod "75721c64-91e7-468b-8157-9f7b0f8060b0" (UID: "75721c64-91e7-468b-8157-9f7b0f8060b0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.125944 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/37519387-1738-4500-9953-52deba3e4a85-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.126471 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/37519387-1738-4500-9953-52deba3e4a85-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.126753 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/37519387-1738-4500-9953-52deba3e4a85-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.127055 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/37519387-1738-4500-9953-52deba3e4a85-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.127269 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/37519387-1738-4500-9953-52deba3e4a85-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.128147 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/37519387-1738-4500-9953-52deba3e4a85-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.140412 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-scripts" (OuterVolumeSpecName: "scripts") pod "75721c64-91e7-468b-8157-9f7b0f8060b0" (UID: "75721c64-91e7-468b-8157-9f7b0f8060b0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.140894 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/37519387-1738-4500-9953-52deba3e4a85-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.142151 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/cf4c3569-6860-4c2a-8923-42e436279a11-certs\") pod \"cloudkitty-proc-0\" (UID: \"cf4c3569-6860-4c2a-8923-42e436279a11\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.142793 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf4c3569-6860-4c2a-8923-42e436279a11-scripts\") pod \"cloudkitty-proc-0\" (UID: \"cf4c3569-6860-4c2a-8923-42e436279a11\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.143516 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.143542 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a2eaf337fb87b6a71958dbd52c87dbf5c448ea95938dfd82cb1cc22a9e40efc9/globalmount\"" pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.145657 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf4c3569-6860-4c2a-8923-42e436279a11-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"cf4c3569-6860-4c2a-8923-42e436279a11\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.146433 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "75721c64-91e7-468b-8157-9f7b0f8060b0" (UID: "75721c64-91e7-468b-8157-9f7b0f8060b0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.147257 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75721c64-91e7-468b-8157-9f7b0f8060b0-certs" (OuterVolumeSpecName: "certs") pod "75721c64-91e7-468b-8157-9f7b0f8060b0" (UID: "75721c64-91e7-468b-8157-9f7b0f8060b0"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.148225 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/37519387-1738-4500-9953-52deba3e4a85-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.148551 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf4c3569-6860-4c2a-8923-42e436279a11-config-data\") pod \"cloudkitty-proc-0\" (UID: \"cf4c3569-6860-4c2a-8923-42e436279a11\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.148797 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75721c64-91e7-468b-8157-9f7b0f8060b0-kube-api-access-ptk2k" (OuterVolumeSpecName: "kube-api-access-ptk2k") pod "75721c64-91e7-468b-8157-9f7b0f8060b0" (UID: "75721c64-91e7-468b-8157-9f7b0f8060b0"). InnerVolumeSpecName "kube-api-access-ptk2k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.149181 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/37519387-1738-4500-9953-52deba3e4a85-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.149255 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf4c3569-6860-4c2a-8923-42e436279a11-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"cf4c3569-6860-4c2a-8923-42e436279a11\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.152935 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxmc9\" (UniqueName: \"kubernetes.io/projected/37519387-1738-4500-9953-52deba3e4a85-kube-api-access-dxmc9\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.163220 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8nqs\" (UniqueName: \"kubernetes.io/projected/cf4c3569-6860-4c2a-8923-42e436279a11-kube-api-access-x8nqs\") pod \"cloudkitty-proc-0\" (UID: \"cf4c3569-6860-4c2a-8923-42e436279a11\") " pod="openstack/cloudkitty-proc-0"
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.215954 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "75721c64-91e7-468b-8157-9f7b0f8060b0" (UID: "75721c64-91e7-468b-8157-9f7b0f8060b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.220520 4781 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.220549 4781 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/75721c64-91e7-468b-8157-9f7b0f8060b0-certs\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.220558 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.220566 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.220574 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptk2k\" (UniqueName: \"kubernetes.io/projected/75721c64-91e7-468b-8157-9f7b0f8060b0-kube-api-access-ptk2k\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.220585 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75721c64-91e7-468b-8157-9f7b0f8060b0-logs\") on node \"crc\" DevicePath \"\""
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.231811 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.250449 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-public-tls-certs"
(OuterVolumeSpecName: "public-tls-certs") pod "75721c64-91e7-468b-8157-9f7b0f8060b0" (UID: "75721c64-91e7-468b-8157-9f7b0f8060b0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.273994 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-78362793-d2f7-4c5f-943c-efd8f93773cb\") pod \"rabbitmq-cell1-server-0\" (UID: \"37519387-1738-4500-9953-52deba3e4a85\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.283870 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-config-data" (OuterVolumeSpecName: "config-data") pod "75721c64-91e7-468b-8157-9f7b0f8060b0" (UID: "75721c64-91e7-468b-8157-9f7b0f8060b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.300847 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "75721c64-91e7-468b-8157-9f7b0f8060b0" (UID: "75721c64-91e7-468b-8157-9f7b0f8060b0"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.323356 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.323485 4781 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.324194 4781 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75721c64-91e7-468b-8157-9f7b0f8060b0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.453723 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.490955 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.615538 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ed38e2f2-b350-4abd-abe2-859c9d504aa8","Type":"ContainerStarted","Data":"2352a458a3fa8043406f44144b7eb6d0f2fae518c516b02b5cff94fb50ca50fa"} Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.658307 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.658597 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"75721c64-91e7-468b-8157-9f7b0f8060b0","Type":"ContainerDied","Data":"e213bc3be73bd79fd15b3c136a6bd5766ca6edcbbbd34d62e83bc41711b9178a"} Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.658686 4781 scope.go:117] "RemoveContainer" containerID="e213bc3be73bd79fd15b3c136a6bd5766ca6edcbbbd34d62e83bc41711b9178a" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.661757 4781 generic.go:334] "Generic (PLEG): container finished" podID="75721c64-91e7-468b-8157-9f7b0f8060b0" containerID="e213bc3be73bd79fd15b3c136a6bd5766ca6edcbbbd34d62e83bc41711b9178a" exitCode=0 Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.661818 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"75721c64-91e7-468b-8157-9f7b0f8060b0","Type":"ContainerDied","Data":"27f0f2f53c09daadc606bc872e1f5df520a0c8f2a01549f894ec755d7a09a157"} Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.692271 4781 scope.go:117] "RemoveContainer" containerID="d2ded5bf94c8c14f72027674825df34d27ac9ed838299103764bc5aba595b241" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.728944 4781 scope.go:117] "RemoveContainer" containerID="e213bc3be73bd79fd15b3c136a6bd5766ca6edcbbbd34d62e83bc41711b9178a" Feb 27 00:31:34 crc kubenswrapper[4781]: E0227 00:31:34.731736 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e213bc3be73bd79fd15b3c136a6bd5766ca6edcbbbd34d62e83bc41711b9178a\": container with ID starting with e213bc3be73bd79fd15b3c136a6bd5766ca6edcbbbd34d62e83bc41711b9178a not found: ID does not exist" containerID="e213bc3be73bd79fd15b3c136a6bd5766ca6edcbbbd34d62e83bc41711b9178a" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.731769 4781 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e213bc3be73bd79fd15b3c136a6bd5766ca6edcbbbd34d62e83bc41711b9178a"} err="failed to get container status \"e213bc3be73bd79fd15b3c136a6bd5766ca6edcbbbd34d62e83bc41711b9178a\": rpc error: code = NotFound desc = could not find container \"e213bc3be73bd79fd15b3c136a6bd5766ca6edcbbbd34d62e83bc41711b9178a\": container with ID starting with e213bc3be73bd79fd15b3c136a6bd5766ca6edcbbbd34d62e83bc41711b9178a not found: ID does not exist" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.731791 4781 scope.go:117] "RemoveContainer" containerID="d2ded5bf94c8c14f72027674825df34d27ac9ed838299103764bc5aba595b241" Feb 27 00:31:34 crc kubenswrapper[4781]: E0227 00:31:34.735923 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2ded5bf94c8c14f72027674825df34d27ac9ed838299103764bc5aba595b241\": container with ID starting with d2ded5bf94c8c14f72027674825df34d27ac9ed838299103764bc5aba595b241 not found: ID does not exist" containerID="d2ded5bf94c8c14f72027674825df34d27ac9ed838299103764bc5aba595b241" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.735955 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2ded5bf94c8c14f72027674825df34d27ac9ed838299103764bc5aba595b241"} err="failed to get container status \"d2ded5bf94c8c14f72027674825df34d27ac9ed838299103764bc5aba595b241\": rpc error: code = NotFound desc = could not find container \"d2ded5bf94c8c14f72027674825df34d27ac9ed838299103764bc5aba595b241\": container with ID starting with d2ded5bf94c8c14f72027674825df34d27ac9ed838299103764bc5aba595b241 not found: ID does not exist" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.735998 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.755912 4781 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.769180 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Feb 27 00:31:34 crc kubenswrapper[4781]: E0227 00:31:34.769763 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75721c64-91e7-468b-8157-9f7b0f8060b0" containerName="cloudkitty-api" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.769782 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="75721c64-91e7-468b-8157-9f7b0f8060b0" containerName="cloudkitty-api" Feb 27 00:31:34 crc kubenswrapper[4781]: E0227 00:31:34.769827 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75721c64-91e7-468b-8157-9f7b0f8060b0" containerName="cloudkitty-api-log" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.769833 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="75721c64-91e7-468b-8157-9f7b0f8060b0" containerName="cloudkitty-api-log" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.770024 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="75721c64-91e7-468b-8157-9f7b0f8060b0" containerName="cloudkitty-api" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.770039 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="75721c64-91e7-468b-8157-9f7b0f8060b0" containerName="cloudkitty-api-log" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.771214 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.774057 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-public-svc" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.774241 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-internal-svc" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.774340 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.789792 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.830096 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-982rb"] Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.832887 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.840899 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.851851 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7ad9523-5281-4d1c-a9d5-92982905d525-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.852165 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7ad9523-5281-4d1c-a9d5-92982905d525-logs\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 
00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.852321 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9hkk\" (UniqueName: \"kubernetes.io/projected/a7ad9523-5281-4d1c-a9d5-92982905d525-kube-api-access-q9hkk\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.852476 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7ad9523-5281-4d1c-a9d5-92982905d525-scripts\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.852637 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7ad9523-5281-4d1c-a9d5-92982905d525-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.852735 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7ad9523-5281-4d1c-a9d5-92982905d525-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.852811 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/a7ad9523-5281-4d1c-a9d5-92982905d525-certs\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.852876 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7ad9523-5281-4d1c-a9d5-92982905d525-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.852944 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7ad9523-5281-4d1c-a9d5-92982905d525-config-data\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.897708 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-982rb"] Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.958079 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9hkk\" (UniqueName: \"kubernetes.io/projected/a7ad9523-5281-4d1c-a9d5-92982905d525-kube-api-access-q9hkk\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.958127 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.958174 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " 
pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.958210 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7ad9523-5281-4d1c-a9d5-92982905d525-scripts\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.958245 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7ad9523-5281-4d1c-a9d5-92982905d525-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.958271 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sdc6\" (UniqueName: \"kubernetes.io/projected/f81105ac-48e2-4b90-820a-8d7758ad3b33-kube-api-access-9sdc6\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.958293 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-openstack-edpm-ipam\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.958311 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-config\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 
00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.958333 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7ad9523-5281-4d1c-a9d5-92982905d525-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.958351 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/a7ad9523-5281-4d1c-a9d5-92982905d525-certs\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.958369 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7ad9523-5281-4d1c-a9d5-92982905d525-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.958390 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-dns-svc\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.958408 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7ad9523-5281-4d1c-a9d5-92982905d525-config-data\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.958428 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.958450 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7ad9523-5281-4d1c-a9d5-92982905d525-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.958484 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7ad9523-5281-4d1c-a9d5-92982905d525-logs\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.958932 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7ad9523-5281-4d1c-a9d5-92982905d525-logs\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.970254 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7ad9523-5281-4d1c-a9d5-92982905d525-scripts\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.971052 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7ad9523-5281-4d1c-a9d5-92982905d525-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " 
pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.981208 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7ad9523-5281-4d1c-a9d5-92982905d525-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.991871 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7ad9523-5281-4d1c-a9d5-92982905d525-config-data\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.992252 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7ad9523-5281-4d1c-a9d5-92982905d525-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.992772 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/a7ad9523-5281-4d1c-a9d5-92982905d525-certs\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:34 crc kubenswrapper[4781]: I0227 00:31:34.995918 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7ad9523-5281-4d1c-a9d5-92982905d525-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.004271 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9hkk\" (UniqueName: 
\"kubernetes.io/projected/a7ad9523-5281-4d1c-a9d5-92982905d525-kube-api-access-q9hkk\") pod \"cloudkitty-api-0\" (UID: \"a7ad9523-5281-4d1c-a9d5-92982905d525\") " pod="openstack/cloudkitty-api-0" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.062991 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.063092 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sdc6\" (UniqueName: \"kubernetes.io/projected/f81105ac-48e2-4b90-820a-8d7758ad3b33-kube-api-access-9sdc6\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.063122 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-openstack-edpm-ipam\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.063142 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-config\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.063181 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-dns-svc\") pod 
\"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.063205 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.063270 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.064129 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.065490 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-dns-svc\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.065984 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-openstack-edpm-ipam\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " 
pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.066041 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.068226 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-config\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.068435 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.071969 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.092569 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sdc6\" (UniqueName: \"kubernetes.io/projected/f81105ac-48e2-4b90-820a-8d7758ad3b33-kube-api-access-9sdc6\") pod \"dnsmasq-dns-dbb88bf8c-982rb\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.111139 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.203054 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.228731 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.323809 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75721c64-91e7-468b-8157-9f7b0f8060b0" path="/var/lib/kubelet/pods/75721c64-91e7-468b-8157-9f7b0f8060b0/volumes" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.324831 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7ca2a9f-a42e-4d9b-89a7-f2590842f328" path="/var/lib/kubelet/pods/c7ca2a9f-a42e-4d9b-89a7-f2590842f328/volumes" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.326656 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db34a476-dd22-4085-bb2c-a8e57b0d9889" path="/var/lib/kubelet/pods/db34a476-dd22-4085-bb2c-a8e57b0d9889/volumes" Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.599715 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.678829 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"37519387-1738-4500-9953-52deba3e4a85","Type":"ContainerStarted","Data":"7b2bc74dad8d8a36748bc47857d2093994bda1653fc9f0dd1fdc558a4806b28f"} Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.682166 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"cf4c3569-6860-4c2a-8923-42e436279a11","Type":"ContainerStarted","Data":"15f5de16a32527cfaf74ad662d3f3049ea76aad5457fd4357ff0db16b4599bf4"} Feb 27 00:31:35 crc kubenswrapper[4781]: W0227 00:31:35.706804 4781 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7ad9523_5281_4d1c_a9d5_92982905d525.slice/crio-0c79a857e65c1b2a45875ccb98d3fdf7c4e5a3bbcc5c33cf6ee46d5709600a11 WatchSource:0}: Error finding container 0c79a857e65c1b2a45875ccb98d3fdf7c4e5a3bbcc5c33cf6ee46d5709600a11: Status 404 returned error can't find the container with id 0c79a857e65c1b2a45875ccb98d3fdf7c4e5a3bbcc5c33cf6ee46d5709600a11 Feb 27 00:31:35 crc kubenswrapper[4781]: I0227 00:31:35.869239 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-982rb"] Feb 27 00:31:36 crc kubenswrapper[4781]: I0227 00:31:36.699151 4781 generic.go:334] "Generic (PLEG): container finished" podID="f81105ac-48e2-4b90-820a-8d7758ad3b33" containerID="f92462c24cf34832cfbb0af5ec14d00e7a6663e73c5d3fb868025da2177d2583" exitCode=0 Feb 27 00:31:36 crc kubenswrapper[4781]: I0227 00:31:36.699262 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" event={"ID":"f81105ac-48e2-4b90-820a-8d7758ad3b33","Type":"ContainerDied","Data":"f92462c24cf34832cfbb0af5ec14d00e7a6663e73c5d3fb868025da2177d2583"} Feb 27 00:31:36 crc kubenswrapper[4781]: I0227 00:31:36.699509 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" event={"ID":"f81105ac-48e2-4b90-820a-8d7758ad3b33","Type":"ContainerStarted","Data":"68785af84dc6132ad668c9748f55cbc0790b34d7e605682887df4cda02988cd2"} Feb 27 00:31:36 crc kubenswrapper[4781]: I0227 00:31:36.717119 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ed38e2f2-b350-4abd-abe2-859c9d504aa8","Type":"ContainerStarted","Data":"bfaee7ec7de3505b4e22cf4499593dc512858eb8d4ea24469079b8a31c14c355"} Feb 27 00:31:36 crc kubenswrapper[4781]: I0227 00:31:36.726050 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" 
event={"ID":"a7ad9523-5281-4d1c-a9d5-92982905d525","Type":"ContainerStarted","Data":"4db093f6d8dc270dfe390ce7dc919fc13daaac34b82c5bce418d56fe704e73aa"} Feb 27 00:31:36 crc kubenswrapper[4781]: I0227 00:31:36.726101 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"a7ad9523-5281-4d1c-a9d5-92982905d525","Type":"ContainerStarted","Data":"d2adb80aec2419491004b0d99da92c28ca8237830402ae09e31cc85b3c9be10b"} Feb 27 00:31:36 crc kubenswrapper[4781]: I0227 00:31:36.726117 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"a7ad9523-5281-4d1c-a9d5-92982905d525","Type":"ContainerStarted","Data":"0c79a857e65c1b2a45875ccb98d3fdf7c4e5a3bbcc5c33cf6ee46d5709600a11"} Feb 27 00:31:36 crc kubenswrapper[4781]: I0227 00:31:36.727211 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Feb 27 00:31:36 crc kubenswrapper[4781]: I0227 00:31:36.735273 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"cf4c3569-6860-4c2a-8923-42e436279a11","Type":"ContainerStarted","Data":"5c56aa725d72412984d893e2e1e07b2db369fa8fbd72a3dd7aab740b2a509825"} Feb 27 00:31:36 crc kubenswrapper[4781]: I0227 00:31:36.796375 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=2.796355771 podStartE2EDuration="2.796355771s" podCreationTimestamp="2026-02-27 00:31:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:31:36.777273686 +0000 UTC m=+1566.034813260" watchObservedRunningTime="2026-02-27 00:31:36.796355771 +0000 UTC m=+1566.053895325" Feb 27 00:31:36 crc kubenswrapper[4781]: I0227 00:31:36.808966 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=3.558241368 
podStartE2EDuration="3.808949401s" podCreationTimestamp="2026-02-27 00:31:33 +0000 UTC" firstStartedPulling="2026-02-27 00:31:35.066097657 +0000 UTC m=+1564.323637211" lastFinishedPulling="2026-02-27 00:31:35.31680569 +0000 UTC m=+1564.574345244" observedRunningTime="2026-02-27 00:31:36.794157462 +0000 UTC m=+1566.051697016" watchObservedRunningTime="2026-02-27 00:31:36.808949401 +0000 UTC m=+1566.066488955" Feb 27 00:31:37 crc kubenswrapper[4781]: I0227 00:31:37.745894 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"37519387-1738-4500-9953-52deba3e4a85","Type":"ContainerStarted","Data":"35670c776bc05f00a77eed47488c259dc1a1c6ce2969f6d1ea6d21ba78546cf9"} Feb 27 00:31:37 crc kubenswrapper[4781]: I0227 00:31:37.752556 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" event={"ID":"f81105ac-48e2-4b90-820a-8d7758ad3b33","Type":"ContainerStarted","Data":"f246e70f323e7906b0d64d2b28a0fd6bc0b8cffd45def7a74d349ce046b56b21"} Feb 27 00:31:37 crc kubenswrapper[4781]: I0227 00:31:37.752596 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:37 crc kubenswrapper[4781]: I0227 00:31:37.821968 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" podStartSLOduration=3.821947659 podStartE2EDuration="3.821947659s" podCreationTimestamp="2026-02-27 00:31:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:31:37.806403939 +0000 UTC m=+1567.063943503" watchObservedRunningTime="2026-02-27 00:31:37.821947659 +0000 UTC m=+1567.079487213" Feb 27 00:31:40 crc kubenswrapper[4781]: E0227 00:31:40.320647 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfee23b33_5d55_45c9_b024_0b4865019095.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfee23b33_5d55_45c9_b024_0b4865019095.slice/crio-c1cf39a6e1aaa1fdbc695699fd6efe141913102901ca2317e8b825da4d37a1de\": RecentStats: unable to find data in memory cache]" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.205276 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.278788 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-hm24r"] Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.279057 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" podUID="f8bd7379-6c29-4c0f-bb7e-14c18f98a18e" containerName="dnsmasq-dns" containerID="cri-o://e59534a993c981832971b7ef17c8f1e9f9d24b23112b98369b8ccf0ba58923af" gracePeriod=10 Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.457950 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85f64749dc-9drr8"] Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.459786 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.473704 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85f64749dc-9drr8"] Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.499216 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-dns-swift-storage-0\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.499280 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-ovsdbserver-nb\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.499318 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-dns-svc\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.499347 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-config\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.499376 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-gq4fg\" (UniqueName: \"kubernetes.io/projected/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-kube-api-access-gq4fg\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.499400 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-ovsdbserver-sb\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.499448 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-openstack-edpm-ipam\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.600077 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-ovsdbserver-nb\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.600134 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-dns-svc\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.600164 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-config\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.600194 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq4fg\" (UniqueName: \"kubernetes.io/projected/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-kube-api-access-gq4fg\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.600218 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-ovsdbserver-sb\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.600268 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-openstack-edpm-ipam\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.600337 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-dns-swift-storage-0\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.600976 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-dns-svc\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.601144 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-ovsdbserver-nb\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.601226 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-dns-swift-storage-0\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.601511 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-ovsdbserver-sb\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.601517 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-config\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.602416 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-openstack-edpm-ipam\") pod 
\"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.624405 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq4fg\" (UniqueName: \"kubernetes.io/projected/f01f0f26-7e7a-464f-8f50-4d49bf87cb46-kube-api-access-gq4fg\") pod \"dnsmasq-dns-85f64749dc-9drr8\" (UID: \"f01f0f26-7e7a-464f-8f50-4d49bf87cb46\") " pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.785353 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.842759 4781 generic.go:334] "Generic (PLEG): container finished" podID="f8bd7379-6c29-4c0f-bb7e-14c18f98a18e" containerID="e59534a993c981832971b7ef17c8f1e9f9d24b23112b98369b8ccf0ba58923af" exitCode=0 Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.842814 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" event={"ID":"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e","Type":"ContainerDied","Data":"e59534a993c981832971b7ef17c8f1e9f9d24b23112b98369b8ccf0ba58923af"} Feb 27 00:31:45 crc kubenswrapper[4781]: I0227 00:31:45.962716 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.115405 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-ovsdbserver-nb\") pod \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.115478 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-ovsdbserver-sb\") pod \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.115555 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwcnr\" (UniqueName: \"kubernetes.io/projected/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-kube-api-access-cwcnr\") pod \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.115617 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-config\") pod \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.115687 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-dns-swift-storage-0\") pod \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.115832 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-dns-svc\") pod \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\" (UID: \"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e\") " Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.132329 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-kube-api-access-cwcnr" (OuterVolumeSpecName: "kube-api-access-cwcnr") pod "f8bd7379-6c29-4c0f-bb7e-14c18f98a18e" (UID: "f8bd7379-6c29-4c0f-bb7e-14c18f98a18e"). InnerVolumeSpecName "kube-api-access-cwcnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.203558 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-config" (OuterVolumeSpecName: "config") pod "f8bd7379-6c29-4c0f-bb7e-14c18f98a18e" (UID: "f8bd7379-6c29-4c0f-bb7e-14c18f98a18e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.218851 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f8bd7379-6c29-4c0f-bb7e-14c18f98a18e" (UID: "f8bd7379-6c29-4c0f-bb7e-14c18f98a18e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.223213 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwcnr\" (UniqueName: \"kubernetes.io/projected/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-kube-api-access-cwcnr\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.223244 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.223257 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.236198 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f8bd7379-6c29-4c0f-bb7e-14c18f98a18e" (UID: "f8bd7379-6c29-4c0f-bb7e-14c18f98a18e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.264006 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f8bd7379-6c29-4c0f-bb7e-14c18f98a18e" (UID: "f8bd7379-6c29-4c0f-bb7e-14c18f98a18e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.275178 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f8bd7379-6c29-4c0f-bb7e-14c18f98a18e" (UID: "f8bd7379-6c29-4c0f-bb7e-14c18f98a18e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.324909 4781 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.324944 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.324955 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.330012 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85f64749dc-9drr8"] Feb 27 00:31:46 crc kubenswrapper[4781]: W0227 00:31:46.330245 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf01f0f26_7e7a_464f_8f50_4d49bf87cb46.slice/crio-c9e9d5f5376699e834cdeb5a1d46d883ebb4c9dad98d3930976901abe0fd539e WatchSource:0}: Error finding container c9e9d5f5376699e834cdeb5a1d46d883ebb4c9dad98d3930976901abe0fd539e: Status 404 returned error can't find the container with id 
c9e9d5f5376699e834cdeb5a1d46d883ebb4c9dad98d3930976901abe0fd539e Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.854167 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" event={"ID":"f8bd7379-6c29-4c0f-bb7e-14c18f98a18e","Type":"ContainerDied","Data":"bc2ce3ec147ae0ea063d3bc5998697394125f3fd2381d07ae16cdf6df5227b71"} Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.854195 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-hm24r" Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.854546 4781 scope.go:117] "RemoveContainer" containerID="e59534a993c981832971b7ef17c8f1e9f9d24b23112b98369b8ccf0ba58923af" Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.856116 4781 generic.go:334] "Generic (PLEG): container finished" podID="f01f0f26-7e7a-464f-8f50-4d49bf87cb46" containerID="222056db10424527f55219b5eb2c847209139d1f76c50f383e85073a4ddf04ff" exitCode=0 Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.856154 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f64749dc-9drr8" event={"ID":"f01f0f26-7e7a-464f-8f50-4d49bf87cb46","Type":"ContainerDied","Data":"222056db10424527f55219b5eb2c847209139d1f76c50f383e85073a4ddf04ff"} Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.856176 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f64749dc-9drr8" event={"ID":"f01f0f26-7e7a-464f-8f50-4d49bf87cb46","Type":"ContainerStarted","Data":"c9e9d5f5376699e834cdeb5a1d46d883ebb4c9dad98d3930976901abe0fd539e"} Feb 27 00:31:46 crc kubenswrapper[4781]: I0227 00:31:46.887724 4781 scope.go:117] "RemoveContainer" containerID="0477def692642480b7baa681e79da18341ef273274b3570944d4f51dd3971947" Feb 27 00:31:47 crc kubenswrapper[4781]: I0227 00:31:47.069134 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-hm24r"] Feb 27 00:31:47 crc 
kubenswrapper[4781]: I0227 00:31:47.082254 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-hm24r"] Feb 27 00:31:47 crc kubenswrapper[4781]: I0227 00:31:47.324853 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8bd7379-6c29-4c0f-bb7e-14c18f98a18e" path="/var/lib/kubelet/pods/f8bd7379-6c29-4c0f-bb7e-14c18f98a18e/volumes" Feb 27 00:31:47 crc kubenswrapper[4781]: I0227 00:31:47.867855 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f64749dc-9drr8" event={"ID":"f01f0f26-7e7a-464f-8f50-4d49bf87cb46","Type":"ContainerStarted","Data":"c9ec13fed5c820c01f8981c2700184d9e0874c5b37cbe966d1fb30046ce9b5df"} Feb 27 00:31:47 crc kubenswrapper[4781]: I0227 00:31:47.868479 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:47 crc kubenswrapper[4781]: I0227 00:31:47.887385 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85f64749dc-9drr8" podStartSLOduration=2.887367998 podStartE2EDuration="2.887367998s" podCreationTimestamp="2026-02-27 00:31:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:31:47.883338989 +0000 UTC m=+1577.140878543" watchObservedRunningTime="2026-02-27 00:31:47.887367998 +0000 UTC m=+1577.144907552" Feb 27 00:31:50 crc kubenswrapper[4781]: E0227 00:31:50.627043 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfee23b33_5d55_45c9_b024_0b4865019095.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfee23b33_5d55_45c9_b024_0b4865019095.slice/crio-c1cf39a6e1aaa1fdbc695699fd6efe141913102901ca2317e8b825da4d37a1de\": RecentStats: unable 
to find data in memory cache]" Feb 27 00:31:51 crc kubenswrapper[4781]: I0227 00:31:51.620865 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 27 00:31:55 crc kubenswrapper[4781]: I0227 00:31:55.787815 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85f64749dc-9drr8" Feb 27 00:31:55 crc kubenswrapper[4781]: I0227 00:31:55.858151 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-982rb"] Feb 27 00:31:55 crc kubenswrapper[4781]: I0227 00:31:55.858922 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" podUID="f81105ac-48e2-4b90-820a-8d7758ad3b33" containerName="dnsmasq-dns" containerID="cri-o://f246e70f323e7906b0d64d2b28a0fd6bc0b8cffd45def7a74d349ce046b56b21" gracePeriod=10 Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.518990 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.649307 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-dns-svc\") pod \"f81105ac-48e2-4b90-820a-8d7758ad3b33\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.649480 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-openstack-edpm-ipam\") pod \"f81105ac-48e2-4b90-820a-8d7758ad3b33\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.649525 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-ovsdbserver-sb\") pod \"f81105ac-48e2-4b90-820a-8d7758ad3b33\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.649582 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-config\") pod \"f81105ac-48e2-4b90-820a-8d7758ad3b33\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.649617 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-ovsdbserver-nb\") pod \"f81105ac-48e2-4b90-820a-8d7758ad3b33\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.649730 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sdc6\" 
(UniqueName: \"kubernetes.io/projected/f81105ac-48e2-4b90-820a-8d7758ad3b33-kube-api-access-9sdc6\") pod \"f81105ac-48e2-4b90-820a-8d7758ad3b33\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.649838 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-dns-swift-storage-0\") pod \"f81105ac-48e2-4b90-820a-8d7758ad3b33\" (UID: \"f81105ac-48e2-4b90-820a-8d7758ad3b33\") " Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.664909 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f81105ac-48e2-4b90-820a-8d7758ad3b33-kube-api-access-9sdc6" (OuterVolumeSpecName: "kube-api-access-9sdc6") pod "f81105ac-48e2-4b90-820a-8d7758ad3b33" (UID: "f81105ac-48e2-4b90-820a-8d7758ad3b33"). InnerVolumeSpecName "kube-api-access-9sdc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.727118 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f81105ac-48e2-4b90-820a-8d7758ad3b33" (UID: "f81105ac-48e2-4b90-820a-8d7758ad3b33"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.733929 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f81105ac-48e2-4b90-820a-8d7758ad3b33" (UID: "f81105ac-48e2-4b90-820a-8d7758ad3b33"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.738348 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "f81105ac-48e2-4b90-820a-8d7758ad3b33" (UID: "f81105ac-48e2-4b90-820a-8d7758ad3b33"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.753022 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f81105ac-48e2-4b90-820a-8d7758ad3b33" (UID: "f81105ac-48e2-4b90-820a-8d7758ad3b33"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.755086 4781 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.755125 4781 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.755137 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.755146 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.755156 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sdc6\" (UniqueName: \"kubernetes.io/projected/f81105ac-48e2-4b90-820a-8d7758ad3b33-kube-api-access-9sdc6\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.767290 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f81105ac-48e2-4b90-820a-8d7758ad3b33" (UID: "f81105ac-48e2-4b90-820a-8d7758ad3b33"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.767595 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-config" (OuterVolumeSpecName: "config") pod "f81105ac-48e2-4b90-820a-8d7758ad3b33" (UID: "f81105ac-48e2-4b90-820a-8d7758ad3b33"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.857366 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.857407 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f81105ac-48e2-4b90-820a-8d7758ad3b33-config\") on node \"crc\" DevicePath \"\"" Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.979757 4781 generic.go:334] "Generic (PLEG): container finished" podID="f81105ac-48e2-4b90-820a-8d7758ad3b33" containerID="f246e70f323e7906b0d64d2b28a0fd6bc0b8cffd45def7a74d349ce046b56b21" exitCode=0 Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.979824 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" event={"ID":"f81105ac-48e2-4b90-820a-8d7758ad3b33","Type":"ContainerDied","Data":"f246e70f323e7906b0d64d2b28a0fd6bc0b8cffd45def7a74d349ce046b56b21"} Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.979849 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" event={"ID":"f81105ac-48e2-4b90-820a-8d7758ad3b33","Type":"ContainerDied","Data":"68785af84dc6132ad668c9748f55cbc0790b34d7e605682887df4cda02988cd2"} Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.979864 4781 scope.go:117] "RemoveContainer" containerID="f246e70f323e7906b0d64d2b28a0fd6bc0b8cffd45def7a74d349ce046b56b21" Feb 27 00:31:56 crc kubenswrapper[4781]: I0227 00:31:56.979999 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-982rb" Feb 27 00:31:57 crc kubenswrapper[4781]: I0227 00:31:57.040605 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-982rb"] Feb 27 00:31:57 crc kubenswrapper[4781]: I0227 00:31:57.040940 4781 scope.go:117] "RemoveContainer" containerID="f92462c24cf34832cfbb0af5ec14d00e7a6663e73c5d3fb868025da2177d2583" Feb 27 00:31:57 crc kubenswrapper[4781]: I0227 00:31:57.052202 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-982rb"] Feb 27 00:31:57 crc kubenswrapper[4781]: I0227 00:31:57.075554 4781 scope.go:117] "RemoveContainer" containerID="f246e70f323e7906b0d64d2b28a0fd6bc0b8cffd45def7a74d349ce046b56b21" Feb 27 00:31:57 crc kubenswrapper[4781]: E0227 00:31:57.076034 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f246e70f323e7906b0d64d2b28a0fd6bc0b8cffd45def7a74d349ce046b56b21\": container with ID starting with f246e70f323e7906b0d64d2b28a0fd6bc0b8cffd45def7a74d349ce046b56b21 not found: ID does not exist" containerID="f246e70f323e7906b0d64d2b28a0fd6bc0b8cffd45def7a74d349ce046b56b21" Feb 27 00:31:57 crc kubenswrapper[4781]: I0227 00:31:57.076099 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f246e70f323e7906b0d64d2b28a0fd6bc0b8cffd45def7a74d349ce046b56b21"} err="failed to get container status \"f246e70f323e7906b0d64d2b28a0fd6bc0b8cffd45def7a74d349ce046b56b21\": rpc error: code = NotFound desc = could not find container \"f246e70f323e7906b0d64d2b28a0fd6bc0b8cffd45def7a74d349ce046b56b21\": container with ID starting with f246e70f323e7906b0d64d2b28a0fd6bc0b8cffd45def7a74d349ce046b56b21 not found: ID does not exist" Feb 27 00:31:57 crc kubenswrapper[4781]: I0227 00:31:57.076125 4781 scope.go:117] "RemoveContainer" containerID="f92462c24cf34832cfbb0af5ec14d00e7a6663e73c5d3fb868025da2177d2583" Feb 27 
00:31:57 crc kubenswrapper[4781]: E0227 00:31:57.076453 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f92462c24cf34832cfbb0af5ec14d00e7a6663e73c5d3fb868025da2177d2583\": container with ID starting with f92462c24cf34832cfbb0af5ec14d00e7a6663e73c5d3fb868025da2177d2583 not found: ID does not exist" containerID="f92462c24cf34832cfbb0af5ec14d00e7a6663e73c5d3fb868025da2177d2583" Feb 27 00:31:57 crc kubenswrapper[4781]: I0227 00:31:57.076485 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f92462c24cf34832cfbb0af5ec14d00e7a6663e73c5d3fb868025da2177d2583"} err="failed to get container status \"f92462c24cf34832cfbb0af5ec14d00e7a6663e73c5d3fb868025da2177d2583\": rpc error: code = NotFound desc = could not find container \"f92462c24cf34832cfbb0af5ec14d00e7a6663e73c5d3fb868025da2177d2583\": container with ID starting with f92462c24cf34832cfbb0af5ec14d00e7a6663e73c5d3fb868025da2177d2583 not found: ID does not exist" Feb 27 00:31:57 crc kubenswrapper[4781]: I0227 00:31:57.320145 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f81105ac-48e2-4b90-820a-8d7758ad3b33" path="/var/lib/kubelet/pods/f81105ac-48e2-4b90-820a-8d7758ad3b33/volumes" Feb 27 00:32:00 crc kubenswrapper[4781]: I0227 00:32:00.153934 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535872-fpvhr"] Feb 27 00:32:00 crc kubenswrapper[4781]: E0227 00:32:00.154898 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f81105ac-48e2-4b90-820a-8d7758ad3b33" containerName="init" Feb 27 00:32:00 crc kubenswrapper[4781]: I0227 00:32:00.154914 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f81105ac-48e2-4b90-820a-8d7758ad3b33" containerName="init" Feb 27 00:32:00 crc kubenswrapper[4781]: E0227 00:32:00.154947 4781 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f81105ac-48e2-4b90-820a-8d7758ad3b33" containerName="dnsmasq-dns" Feb 27 00:32:00 crc kubenswrapper[4781]: I0227 00:32:00.154953 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f81105ac-48e2-4b90-820a-8d7758ad3b33" containerName="dnsmasq-dns" Feb 27 00:32:00 crc kubenswrapper[4781]: E0227 00:32:00.154971 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8bd7379-6c29-4c0f-bb7e-14c18f98a18e" containerName="init" Feb 27 00:32:00 crc kubenswrapper[4781]: I0227 00:32:00.154978 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8bd7379-6c29-4c0f-bb7e-14c18f98a18e" containerName="init" Feb 27 00:32:00 crc kubenswrapper[4781]: E0227 00:32:00.154987 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8bd7379-6c29-4c0f-bb7e-14c18f98a18e" containerName="dnsmasq-dns" Feb 27 00:32:00 crc kubenswrapper[4781]: I0227 00:32:00.154993 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8bd7379-6c29-4c0f-bb7e-14c18f98a18e" containerName="dnsmasq-dns" Feb 27 00:32:00 crc kubenswrapper[4781]: I0227 00:32:00.155169 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8bd7379-6c29-4c0f-bb7e-14c18f98a18e" containerName="dnsmasq-dns" Feb 27 00:32:00 crc kubenswrapper[4781]: I0227 00:32:00.155180 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f81105ac-48e2-4b90-820a-8d7758ad3b33" containerName="dnsmasq-dns" Feb 27 00:32:00 crc kubenswrapper[4781]: I0227 00:32:00.155977 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535872-fpvhr" Feb 27 00:32:00 crc kubenswrapper[4781]: I0227 00:32:00.158355 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:32:00 crc kubenswrapper[4781]: I0227 00:32:00.158564 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:32:00 crc kubenswrapper[4781]: I0227 00:32:00.164346 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:32:00 crc kubenswrapper[4781]: I0227 00:32:00.177282 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535872-fpvhr"] Feb 27 00:32:00 crc kubenswrapper[4781]: I0227 00:32:00.227753 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvkp2\" (UniqueName: \"kubernetes.io/projected/28ad6440-a4bb-43a6-985a-42979a799437-kube-api-access-cvkp2\") pod \"auto-csr-approver-29535872-fpvhr\" (UID: \"28ad6440-a4bb-43a6-985a-42979a799437\") " pod="openshift-infra/auto-csr-approver-29535872-fpvhr" Feb 27 00:32:00 crc kubenswrapper[4781]: I0227 00:32:00.330165 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvkp2\" (UniqueName: \"kubernetes.io/projected/28ad6440-a4bb-43a6-985a-42979a799437-kube-api-access-cvkp2\") pod \"auto-csr-approver-29535872-fpvhr\" (UID: \"28ad6440-a4bb-43a6-985a-42979a799437\") " pod="openshift-infra/auto-csr-approver-29535872-fpvhr" Feb 27 00:32:00 crc kubenswrapper[4781]: I0227 00:32:00.347813 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvkp2\" (UniqueName: \"kubernetes.io/projected/28ad6440-a4bb-43a6-985a-42979a799437-kube-api-access-cvkp2\") pod \"auto-csr-approver-29535872-fpvhr\" (UID: \"28ad6440-a4bb-43a6-985a-42979a799437\") " 
pod="openshift-infra/auto-csr-approver-29535872-fpvhr" Feb 27 00:32:00 crc kubenswrapper[4781]: I0227 00:32:00.490181 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535872-fpvhr" Feb 27 00:32:00 crc kubenswrapper[4781]: E0227 00:32:00.920705 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfee23b33_5d55_45c9_b024_0b4865019095.slice/crio-c1cf39a6e1aaa1fdbc695699fd6efe141913102901ca2317e8b825da4d37a1de\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfee23b33_5d55_45c9_b024_0b4865019095.slice\": RecentStats: unable to find data in memory cache]" Feb 27 00:32:00 crc kubenswrapper[4781]: I0227 00:32:00.971271 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535872-fpvhr"] Feb 27 00:32:01 crc kubenswrapper[4781]: I0227 00:32:01.025242 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535872-fpvhr" event={"ID":"28ad6440-a4bb-43a6-985a-42979a799437","Type":"ContainerStarted","Data":"9e91a53fa0d7b46fa4d6db2c1af114047a98ba8b5905295dec631c90cc238eb5"} Feb 27 00:32:03 crc kubenswrapper[4781]: I0227 00:32:03.047350 4781 generic.go:334] "Generic (PLEG): container finished" podID="28ad6440-a4bb-43a6-985a-42979a799437" containerID="90d3da646bb32391ad6c504fecd5db68f89221b28accf451c40b52dc228b7d89" exitCode=0 Feb 27 00:32:03 crc kubenswrapper[4781]: I0227 00:32:03.047418 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535872-fpvhr" event={"ID":"28ad6440-a4bb-43a6-985a-42979a799437","Type":"ContainerDied","Data":"90d3da646bb32391ad6c504fecd5db68f89221b28accf451c40b52dc228b7d89"} Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.529604 4781 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt"] Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.531177 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt" Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.534361 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.536202 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.536424 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mvxs7" Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.538400 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.548647 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt"] Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.583048 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535872-fpvhr" Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.621898 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05795337-1929-47d6-b63f-96d078b66c47-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt\" (UID: \"05795337-1929-47d6-b63f-96d078b66c47\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt" Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.622018 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05795337-1929-47d6-b63f-96d078b66c47-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt\" (UID: \"05795337-1929-47d6-b63f-96d078b66c47\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt" Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.622041 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05795337-1929-47d6-b63f-96d078b66c47-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt\" (UID: \"05795337-1929-47d6-b63f-96d078b66c47\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt" Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.622074 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjdrb\" (UniqueName: \"kubernetes.io/projected/05795337-1929-47d6-b63f-96d078b66c47-kube-api-access-gjdrb\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt\" (UID: \"05795337-1929-47d6-b63f-96d078b66c47\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt" Feb 27 00:32:04 crc kubenswrapper[4781]: 
I0227 00:32:04.723531 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvkp2\" (UniqueName: \"kubernetes.io/projected/28ad6440-a4bb-43a6-985a-42979a799437-kube-api-access-cvkp2\") pod \"28ad6440-a4bb-43a6-985a-42979a799437\" (UID: \"28ad6440-a4bb-43a6-985a-42979a799437\") " Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.723859 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjdrb\" (UniqueName: \"kubernetes.io/projected/05795337-1929-47d6-b63f-96d078b66c47-kube-api-access-gjdrb\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt\" (UID: \"05795337-1929-47d6-b63f-96d078b66c47\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt" Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.723983 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05795337-1929-47d6-b63f-96d078b66c47-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt\" (UID: \"05795337-1929-47d6-b63f-96d078b66c47\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt" Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.724066 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05795337-1929-47d6-b63f-96d078b66c47-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt\" (UID: \"05795337-1929-47d6-b63f-96d078b66c47\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt" Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.724089 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05795337-1929-47d6-b63f-96d078b66c47-ssh-key-openstack-edpm-ipam\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt\" (UID: \"05795337-1929-47d6-b63f-96d078b66c47\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt" Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.729649 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05795337-1929-47d6-b63f-96d078b66c47-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt\" (UID: \"05795337-1929-47d6-b63f-96d078b66c47\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt" Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.730304 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05795337-1929-47d6-b63f-96d078b66c47-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt\" (UID: \"05795337-1929-47d6-b63f-96d078b66c47\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt" Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.735204 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05795337-1929-47d6-b63f-96d078b66c47-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt\" (UID: \"05795337-1929-47d6-b63f-96d078b66c47\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt" Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.744906 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28ad6440-a4bb-43a6-985a-42979a799437-kube-api-access-cvkp2" (OuterVolumeSpecName: "kube-api-access-cvkp2") pod "28ad6440-a4bb-43a6-985a-42979a799437" (UID: "28ad6440-a4bb-43a6-985a-42979a799437"). InnerVolumeSpecName "kube-api-access-cvkp2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.746807 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjdrb\" (UniqueName: \"kubernetes.io/projected/05795337-1929-47d6-b63f-96d078b66c47-kube-api-access-gjdrb\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt\" (UID: \"05795337-1929-47d6-b63f-96d078b66c47\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt" Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.826732 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvkp2\" (UniqueName: \"kubernetes.io/projected/28ad6440-a4bb-43a6-985a-42979a799437-kube-api-access-cvkp2\") on node \"crc\" DevicePath \"\"" Feb 27 00:32:04 crc kubenswrapper[4781]: I0227 00:32:04.890824 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt" Feb 27 00:32:05 crc kubenswrapper[4781]: I0227 00:32:05.079592 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535872-fpvhr" event={"ID":"28ad6440-a4bb-43a6-985a-42979a799437","Type":"ContainerDied","Data":"9e91a53fa0d7b46fa4d6db2c1af114047a98ba8b5905295dec631c90cc238eb5"} Feb 27 00:32:05 crc kubenswrapper[4781]: I0227 00:32:05.079663 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e91a53fa0d7b46fa4d6db2c1af114047a98ba8b5905295dec631c90cc238eb5" Feb 27 00:32:05 crc kubenswrapper[4781]: I0227 00:32:05.079730 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535872-fpvhr" Feb 27 00:32:05 crc kubenswrapper[4781]: I0227 00:32:05.562707 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt"] Feb 27 00:32:05 crc kubenswrapper[4781]: W0227 00:32:05.565106 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05795337_1929_47d6_b63f_96d078b66c47.slice/crio-758121309d739274e74f53a22e843b3e32bd2d388e94666e571e5c3e7026bff2 WatchSource:0}: Error finding container 758121309d739274e74f53a22e843b3e32bd2d388e94666e571e5c3e7026bff2: Status 404 returned error can't find the container with id 758121309d739274e74f53a22e843b3e32bd2d388e94666e571e5c3e7026bff2 Feb 27 00:32:05 crc kubenswrapper[4781]: I0227 00:32:05.707688 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535866-qpv8l"] Feb 27 00:32:05 crc kubenswrapper[4781]: I0227 00:32:05.721752 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535866-qpv8l"] Feb 27 00:32:06 crc kubenswrapper[4781]: I0227 00:32:06.090523 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt" event={"ID":"05795337-1929-47d6-b63f-96d078b66c47","Type":"ContainerStarted","Data":"758121309d739274e74f53a22e843b3e32bd2d388e94666e571e5c3e7026bff2"} Feb 27 00:32:07 crc kubenswrapper[4781]: I0227 00:32:07.329995 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fe4edac-acb6-4906-9b3b-42b7c7a98943" path="/var/lib/kubelet/pods/1fe4edac-acb6-4906-9b3b-42b7c7a98943/volumes" Feb 27 00:32:08 crc kubenswrapper[4781]: I0227 00:32:08.124703 4781 generic.go:334] "Generic (PLEG): container finished" podID="ed38e2f2-b350-4abd-abe2-859c9d504aa8" containerID="bfaee7ec7de3505b4e22cf4499593dc512858eb8d4ea24469079b8a31c14c355" 
exitCode=0 Feb 27 00:32:08 crc kubenswrapper[4781]: I0227 00:32:08.124785 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ed38e2f2-b350-4abd-abe2-859c9d504aa8","Type":"ContainerDied","Data":"bfaee7ec7de3505b4e22cf4499593dc512858eb8d4ea24469079b8a31c14c355"} Feb 27 00:32:09 crc kubenswrapper[4781]: I0227 00:32:09.147455 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ed38e2f2-b350-4abd-abe2-859c9d504aa8","Type":"ContainerStarted","Data":"02dd84684c9a248bca28815a43bafa3423f6e8c22db55a548880cac1191bbca2"} Feb 27 00:32:09 crc kubenswrapper[4781]: I0227 00:32:09.148212 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 27 00:32:09 crc kubenswrapper[4781]: I0227 00:32:09.171285 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.171269678 podStartE2EDuration="36.171269678s" podCreationTimestamp="2026-02-27 00:31:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:32:09.170298881 +0000 UTC m=+1598.427838435" watchObservedRunningTime="2026-02-27 00:32:09.171269678 +0000 UTC m=+1598.428809232" Feb 27 00:32:10 crc kubenswrapper[4781]: I0227 00:32:10.160617 4781 generic.go:334] "Generic (PLEG): container finished" podID="37519387-1738-4500-9953-52deba3e4a85" containerID="35670c776bc05f00a77eed47488c259dc1a1c6ce2969f6d1ea6d21ba78546cf9" exitCode=0 Feb 27 00:32:10 crc kubenswrapper[4781]: I0227 00:32:10.160686 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"37519387-1738-4500-9953-52deba3e4a85","Type":"ContainerDied","Data":"35670c776bc05f00a77eed47488c259dc1a1c6ce2969f6d1ea6d21ba78546cf9"} Feb 27 00:32:11 crc kubenswrapper[4781]: E0227 00:32:11.223773 4781 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfee23b33_5d55_45c9_b024_0b4865019095.slice/crio-c1cf39a6e1aaa1fdbc695699fd6efe141913102901ca2317e8b825da4d37a1de\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfee23b33_5d55_45c9_b024_0b4865019095.slice\": RecentStats: unable to find data in memory cache]" Feb 27 00:32:12 crc kubenswrapper[4781]: I0227 00:32:12.110244 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-api-0" Feb 27 00:32:15 crc kubenswrapper[4781]: I0227 00:32:15.055860 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d8cst"] Feb 27 00:32:15 crc kubenswrapper[4781]: E0227 00:32:15.057242 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28ad6440-a4bb-43a6-985a-42979a799437" containerName="oc" Feb 27 00:32:15 crc kubenswrapper[4781]: I0227 00:32:15.057266 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="28ad6440-a4bb-43a6-985a-42979a799437" containerName="oc" Feb 27 00:32:15 crc kubenswrapper[4781]: I0227 00:32:15.057678 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="28ad6440-a4bb-43a6-985a-42979a799437" containerName="oc" Feb 27 00:32:15 crc kubenswrapper[4781]: I0227 00:32:15.060558 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d8cst" Feb 27 00:32:15 crc kubenswrapper[4781]: I0227 00:32:15.066582 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d8cst"] Feb 27 00:32:15 crc kubenswrapper[4781]: I0227 00:32:15.244433 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2-utilities\") pod \"community-operators-d8cst\" (UID: \"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2\") " pod="openshift-marketplace/community-operators-d8cst" Feb 27 00:32:15 crc kubenswrapper[4781]: I0227 00:32:15.244656 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2-catalog-content\") pod \"community-operators-d8cst\" (UID: \"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2\") " pod="openshift-marketplace/community-operators-d8cst" Feb 27 00:32:15 crc kubenswrapper[4781]: I0227 00:32:15.244698 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkhnm\" (UniqueName: \"kubernetes.io/projected/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2-kube-api-access-hkhnm\") pod \"community-operators-d8cst\" (UID: \"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2\") " pod="openshift-marketplace/community-operators-d8cst" Feb 27 00:32:15 crc kubenswrapper[4781]: I0227 00:32:15.346605 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2-catalog-content\") pod \"community-operators-d8cst\" (UID: \"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2\") " pod="openshift-marketplace/community-operators-d8cst" Feb 27 00:32:15 crc kubenswrapper[4781]: I0227 00:32:15.347176 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hkhnm\" (UniqueName: \"kubernetes.io/projected/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2-kube-api-access-hkhnm\") pod \"community-operators-d8cst\" (UID: \"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2\") " pod="openshift-marketplace/community-operators-d8cst" Feb 27 00:32:15 crc kubenswrapper[4781]: I0227 00:32:15.347258 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2-utilities\") pod \"community-operators-d8cst\" (UID: \"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2\") " pod="openshift-marketplace/community-operators-d8cst" Feb 27 00:32:15 crc kubenswrapper[4781]: I0227 00:32:15.348306 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2-utilities\") pod \"community-operators-d8cst\" (UID: \"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2\") " pod="openshift-marketplace/community-operators-d8cst" Feb 27 00:32:15 crc kubenswrapper[4781]: I0227 00:32:15.348664 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2-catalog-content\") pod \"community-operators-d8cst\" (UID: \"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2\") " pod="openshift-marketplace/community-operators-d8cst" Feb 27 00:32:15 crc kubenswrapper[4781]: I0227 00:32:15.367107 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkhnm\" (UniqueName: \"kubernetes.io/projected/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2-kube-api-access-hkhnm\") pod \"community-operators-d8cst\" (UID: \"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2\") " pod="openshift-marketplace/community-operators-d8cst" Feb 27 00:32:15 crc kubenswrapper[4781]: I0227 00:32:15.389508 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d8cst" Feb 27 00:32:15 crc kubenswrapper[4781]: W0227 00:32:15.983832 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48b5a0c6_df06_4a8e_8f22_e17d79c0dcb2.slice/crio-ef28bee2e1ea85e9824d7aa816578f65afc6d5968cc3a2d777e02d92ca74b755 WatchSource:0}: Error finding container ef28bee2e1ea85e9824d7aa816578f65afc6d5968cc3a2d777e02d92ca74b755: Status 404 returned error can't find the container with id ef28bee2e1ea85e9824d7aa816578f65afc6d5968cc3a2d777e02d92ca74b755 Feb 27 00:32:15 crc kubenswrapper[4781]: I0227 00:32:15.987864 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d8cst"] Feb 27 00:32:16 crc kubenswrapper[4781]: I0227 00:32:16.231134 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"37519387-1738-4500-9953-52deba3e4a85","Type":"ContainerStarted","Data":"0048ebb6ce6c868afaed9d5bc7916d5f81a79b806ca8eaad5b59b8285b42b235"} Feb 27 00:32:16 crc kubenswrapper[4781]: I0227 00:32:16.231358 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:32:16 crc kubenswrapper[4781]: I0227 00:32:16.235052 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt" event={"ID":"05795337-1929-47d6-b63f-96d078b66c47","Type":"ContainerStarted","Data":"f0acd75c80c39bafdd1bd55a70eff436e62d8c05f625f9a638a4cea0a03b81f1"} Feb 27 00:32:16 crc kubenswrapper[4781]: I0227 00:32:16.236512 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d8cst" event={"ID":"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2","Type":"ContainerStarted","Data":"ef28bee2e1ea85e9824d7aa816578f65afc6d5968cc3a2d777e02d92ca74b755"} Feb 27 00:32:16 crc kubenswrapper[4781]: I0227 00:32:16.281692 
4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=43.281672083 podStartE2EDuration="43.281672083s" podCreationTimestamp="2026-02-27 00:31:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 00:32:16.256937785 +0000 UTC m=+1605.514477339" watchObservedRunningTime="2026-02-27 00:32:16.281672083 +0000 UTC m=+1605.539211647" Feb 27 00:32:16 crc kubenswrapper[4781]: I0227 00:32:16.286042 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt" podStartSLOduration=2.499810275 podStartE2EDuration="12.286033691s" podCreationTimestamp="2026-02-27 00:32:04 +0000 UTC" firstStartedPulling="2026-02-27 00:32:05.567188459 +0000 UTC m=+1594.824728003" lastFinishedPulling="2026-02-27 00:32:15.353411865 +0000 UTC m=+1604.610951419" observedRunningTime="2026-02-27 00:32:16.27820872 +0000 UTC m=+1605.535748274" watchObservedRunningTime="2026-02-27 00:32:16.286033691 +0000 UTC m=+1605.543573245" Feb 27 00:32:17 crc kubenswrapper[4781]: I0227 00:32:17.247396 4781 generic.go:334] "Generic (PLEG): container finished" podID="48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2" containerID="0da7476083cfc6517f7fd255b8c1f83f83ab7f4385bca6594fc902612e5f98c9" exitCode=0 Feb 27 00:32:17 crc kubenswrapper[4781]: I0227 00:32:17.247513 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d8cst" event={"ID":"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2","Type":"ContainerDied","Data":"0da7476083cfc6517f7fd255b8c1f83f83ab7f4385bca6594fc902612e5f98c9"} Feb 27 00:32:19 crc kubenswrapper[4781]: I0227 00:32:19.272200 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d8cst" 
event={"ID":"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2","Type":"ContainerStarted","Data":"48613f9189568b464f9572351f831ed191b64b7214e1553ad723388f40b7ce49"} Feb 27 00:32:21 crc kubenswrapper[4781]: I0227 00:32:21.302153 4781 generic.go:334] "Generic (PLEG): container finished" podID="48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2" containerID="48613f9189568b464f9572351f831ed191b64b7214e1553ad723388f40b7ce49" exitCode=0 Feb 27 00:32:21 crc kubenswrapper[4781]: I0227 00:32:21.302417 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d8cst" event={"ID":"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2","Type":"ContainerDied","Data":"48613f9189568b464f9572351f831ed191b64b7214e1553ad723388f40b7ce49"} Feb 27 00:32:21 crc kubenswrapper[4781]: E0227 00:32:21.517260 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfee23b33_5d55_45c9_b024_0b4865019095.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfee23b33_5d55_45c9_b024_0b4865019095.slice/crio-c1cf39a6e1aaa1fdbc695699fd6efe141913102901ca2317e8b825da4d37a1de\": RecentStats: unable to find data in memory cache]" Feb 27 00:32:22 crc kubenswrapper[4781]: I0227 00:32:22.322121 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d8cst" event={"ID":"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2","Type":"ContainerStarted","Data":"e291d7b6b611f0e90169fb389462db62593987ace90b0ea5f4e7d72221237c1f"} Feb 27 00:32:22 crc kubenswrapper[4781]: I0227 00:32:22.343639 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d8cst" podStartSLOduration=2.904209517 podStartE2EDuration="7.343604513s" podCreationTimestamp="2026-02-27 00:32:15 +0000 UTC" firstStartedPulling="2026-02-27 00:32:17.249011617 +0000 UTC 
m=+1606.506551171" lastFinishedPulling="2026-02-27 00:32:21.688406603 +0000 UTC m=+1610.945946167" observedRunningTime="2026-02-27 00:32:22.341495326 +0000 UTC m=+1611.599034890" watchObservedRunningTime="2026-02-27 00:32:22.343604513 +0000 UTC m=+1611.601144077" Feb 27 00:32:23 crc kubenswrapper[4781]: I0227 00:32:23.553100 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 27 00:32:25 crc kubenswrapper[4781]: I0227 00:32:25.389736 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d8cst" Feb 27 00:32:25 crc kubenswrapper[4781]: I0227 00:32:25.390309 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d8cst" Feb 27 00:32:25 crc kubenswrapper[4781]: I0227 00:32:25.442789 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d8cst" Feb 27 00:32:26 crc kubenswrapper[4781]: I0227 00:32:26.460752 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d8cst" Feb 27 00:32:26 crc kubenswrapper[4781]: I0227 00:32:26.510265 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d8cst"] Feb 27 00:32:26 crc kubenswrapper[4781]: I0227 00:32:26.975434 4781 scope.go:117] "RemoveContainer" containerID="e2bf980506549d387ee967a300bd50ff50a9e4489a44bcc2c952a5e2c00137a5" Feb 27 00:32:27 crc kubenswrapper[4781]: I0227 00:32:27.020246 4781 scope.go:117] "RemoveContainer" containerID="3d01f4c64b31dda5359f791eed0af9accdc107437765895fcc3cd585df0f55ae" Feb 27 00:32:27 crc kubenswrapper[4781]: I0227 00:32:27.047707 4781 scope.go:117] "RemoveContainer" containerID="c6e860c6c62b63e5a5fe835a4877c45040a36e7fc332cce5af395a3eaa5e24b1" Feb 27 00:32:27 crc kubenswrapper[4781]: I0227 00:32:27.115380 4781 scope.go:117] 
"RemoveContainer" containerID="a458867b742ce8b5b3fdd2c97ebf1845a6845fd00e046dd893821ec44de7237b" Feb 27 00:32:27 crc kubenswrapper[4781]: I0227 00:32:27.145181 4781 scope.go:117] "RemoveContainer" containerID="beeaff089c6577afca77da55c908132f8c47a3993cf1d2011eea873db182b172" Feb 27 00:32:27 crc kubenswrapper[4781]: I0227 00:32:27.198076 4781 scope.go:117] "RemoveContainer" containerID="31cd21a634eff04c79df7b5ee8d37fc4cdb1a4b5a72c57fc0d9aca1961c28780" Feb 27 00:32:27 crc kubenswrapper[4781]: I0227 00:32:27.240359 4781 scope.go:117] "RemoveContainer" containerID="0d295c8666e863d2c0e4e0d3a3e33356c58f61c54e944f8ced4d911133124bc0" Feb 27 00:32:27 crc kubenswrapper[4781]: I0227 00:32:27.294248 4781 scope.go:117] "RemoveContainer" containerID="490f54d4fc0654da6b5add2d9e470584271088a4fc9d0ff0972339bc97ab6f8f" Feb 27 00:32:27 crc kubenswrapper[4781]: I0227 00:32:27.330220 4781 scope.go:117] "RemoveContainer" containerID="28555d58f1fd114e239212917d6df64a83d89ed63bf1f65157974daf4ae101b8" Feb 27 00:32:27 crc kubenswrapper[4781]: I0227 00:32:27.381991 4781 generic.go:334] "Generic (PLEG): container finished" podID="05795337-1929-47d6-b63f-96d078b66c47" containerID="f0acd75c80c39bafdd1bd55a70eff436e62d8c05f625f9a638a4cea0a03b81f1" exitCode=0 Feb 27 00:32:27 crc kubenswrapper[4781]: I0227 00:32:27.382051 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt" event={"ID":"05795337-1929-47d6-b63f-96d078b66c47","Type":"ContainerDied","Data":"f0acd75c80c39bafdd1bd55a70eff436e62d8c05f625f9a638a4cea0a03b81f1"} Feb 27 00:32:28 crc kubenswrapper[4781]: I0227 00:32:28.401146 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d8cst" podUID="48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2" containerName="registry-server" containerID="cri-o://e291d7b6b611f0e90169fb389462db62593987ace90b0ea5f4e7d72221237c1f" gracePeriod=2 Feb 27 00:32:29 crc kubenswrapper[4781]: 
I0227 00:32:29.054957 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.063967 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d8cst" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.181109 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2-catalog-content\") pod \"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2\" (UID: \"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2\") " Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.181159 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjdrb\" (UniqueName: \"kubernetes.io/projected/05795337-1929-47d6-b63f-96d078b66c47-kube-api-access-gjdrb\") pod \"05795337-1929-47d6-b63f-96d078b66c47\" (UID: \"05795337-1929-47d6-b63f-96d078b66c47\") " Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.181237 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkhnm\" (UniqueName: \"kubernetes.io/projected/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2-kube-api-access-hkhnm\") pod \"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2\" (UID: \"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2\") " Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.181261 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05795337-1929-47d6-b63f-96d078b66c47-ssh-key-openstack-edpm-ipam\") pod \"05795337-1929-47d6-b63f-96d078b66c47\" (UID: \"05795337-1929-47d6-b63f-96d078b66c47\") " Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.181313 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/05795337-1929-47d6-b63f-96d078b66c47-inventory\") pod \"05795337-1929-47d6-b63f-96d078b66c47\" (UID: \"05795337-1929-47d6-b63f-96d078b66c47\") " Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.181417 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05795337-1929-47d6-b63f-96d078b66c47-repo-setup-combined-ca-bundle\") pod \"05795337-1929-47d6-b63f-96d078b66c47\" (UID: \"05795337-1929-47d6-b63f-96d078b66c47\") " Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.181550 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2-utilities\") pod \"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2\" (UID: \"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2\") " Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.182330 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2-utilities" (OuterVolumeSpecName: "utilities") pod "48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2" (UID: "48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.190517 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2-kube-api-access-hkhnm" (OuterVolumeSpecName: "kube-api-access-hkhnm") pod "48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2" (UID: "48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2"). InnerVolumeSpecName "kube-api-access-hkhnm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.190894 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05795337-1929-47d6-b63f-96d078b66c47-kube-api-access-gjdrb" (OuterVolumeSpecName: "kube-api-access-gjdrb") pod "05795337-1929-47d6-b63f-96d078b66c47" (UID: "05795337-1929-47d6-b63f-96d078b66c47"). InnerVolumeSpecName "kube-api-access-gjdrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.194807 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05795337-1929-47d6-b63f-96d078b66c47-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "05795337-1929-47d6-b63f-96d078b66c47" (UID: "05795337-1929-47d6-b63f-96d078b66c47"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.217352 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05795337-1929-47d6-b63f-96d078b66c47-inventory" (OuterVolumeSpecName: "inventory") pod "05795337-1929-47d6-b63f-96d078b66c47" (UID: "05795337-1929-47d6-b63f-96d078b66c47"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.237823 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05795337-1929-47d6-b63f-96d078b66c47-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "05795337-1929-47d6-b63f-96d078b66c47" (UID: "05795337-1929-47d6-b63f-96d078b66c47"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.239730 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2" (UID: "48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.283900 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkhnm\" (UniqueName: \"kubernetes.io/projected/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2-kube-api-access-hkhnm\") on node \"crc\" DevicePath \"\"" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.283948 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05795337-1929-47d6-b63f-96d078b66c47-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.283969 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05795337-1929-47d6-b63f-96d078b66c47-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.283990 4781 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05795337-1929-47d6-b63f-96d078b66c47-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.284008 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.284026 4781 reconciler_common.go:293] "Volume detached for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.284046 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjdrb\" (UniqueName: \"kubernetes.io/projected/05795337-1929-47d6-b63f-96d078b66c47-kube-api-access-gjdrb\") on node \"crc\" DevicePath \"\"" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.414845 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt" event={"ID":"05795337-1929-47d6-b63f-96d078b66c47","Type":"ContainerDied","Data":"758121309d739274e74f53a22e843b3e32bd2d388e94666e571e5c3e7026bff2"} Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.414894 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="758121309d739274e74f53a22e843b3e32bd2d388e94666e571e5c3e7026bff2" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.414946 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.419970 4781 generic.go:334] "Generic (PLEG): container finished" podID="48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2" containerID="e291d7b6b611f0e90169fb389462db62593987ace90b0ea5f4e7d72221237c1f" exitCode=0 Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.420022 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d8cst" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.420030 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d8cst" event={"ID":"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2","Type":"ContainerDied","Data":"e291d7b6b611f0e90169fb389462db62593987ace90b0ea5f4e7d72221237c1f"} Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.420088 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d8cst" event={"ID":"48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2","Type":"ContainerDied","Data":"ef28bee2e1ea85e9824d7aa816578f65afc6d5968cc3a2d777e02d92ca74b755"} Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.420113 4781 scope.go:117] "RemoveContainer" containerID="e291d7b6b611f0e90169fb389462db62593987ace90b0ea5f4e7d72221237c1f" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.445923 4781 scope.go:117] "RemoveContainer" containerID="48613f9189568b464f9572351f831ed191b64b7214e1553ad723388f40b7ce49" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.468536 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d8cst"] Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.483592 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-d8cst"] Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.484272 4781 scope.go:117] "RemoveContainer" containerID="0da7476083cfc6517f7fd255b8c1f83f83ab7f4385bca6594fc902612e5f98c9" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.497086 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4"] Feb 27 00:32:29 crc kubenswrapper[4781]: E0227 00:32:29.497810 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2" containerName="extract-utilities" Feb 27 
00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.497830 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2" containerName="extract-utilities" Feb 27 00:32:29 crc kubenswrapper[4781]: E0227 00:32:29.497857 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05795337-1929-47d6-b63f-96d078b66c47" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.497864 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="05795337-1929-47d6-b63f-96d078b66c47" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 27 00:32:29 crc kubenswrapper[4781]: E0227 00:32:29.497874 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2" containerName="registry-server" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.497882 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2" containerName="registry-server" Feb 27 00:32:29 crc kubenswrapper[4781]: E0227 00:32:29.497903 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2" containerName="extract-content" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.497909 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2" containerName="extract-content" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.498108 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2" containerName="registry-server" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.498125 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="05795337-1929-47d6-b63f-96d078b66c47" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.498890 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.501688 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.501979 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mvxs7" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.503388 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.509822 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4"] Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.510226 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.517440 4781 scope.go:117] "RemoveContainer" containerID="e291d7b6b611f0e90169fb389462db62593987ace90b0ea5f4e7d72221237c1f" Feb 27 00:32:29 crc kubenswrapper[4781]: E0227 00:32:29.519598 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e291d7b6b611f0e90169fb389462db62593987ace90b0ea5f4e7d72221237c1f\": container with ID starting with e291d7b6b611f0e90169fb389462db62593987ace90b0ea5f4e7d72221237c1f not found: ID does not exist" containerID="e291d7b6b611f0e90169fb389462db62593987ace90b0ea5f4e7d72221237c1f" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.519758 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e291d7b6b611f0e90169fb389462db62593987ace90b0ea5f4e7d72221237c1f"} err="failed to get container status \"e291d7b6b611f0e90169fb389462db62593987ace90b0ea5f4e7d72221237c1f\": rpc error: code = 
NotFound desc = could not find container \"e291d7b6b611f0e90169fb389462db62593987ace90b0ea5f4e7d72221237c1f\": container with ID starting with e291d7b6b611f0e90169fb389462db62593987ace90b0ea5f4e7d72221237c1f not found: ID does not exist" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.519792 4781 scope.go:117] "RemoveContainer" containerID="48613f9189568b464f9572351f831ed191b64b7214e1553ad723388f40b7ce49" Feb 27 00:32:29 crc kubenswrapper[4781]: E0227 00:32:29.520254 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48613f9189568b464f9572351f831ed191b64b7214e1553ad723388f40b7ce49\": container with ID starting with 48613f9189568b464f9572351f831ed191b64b7214e1553ad723388f40b7ce49 not found: ID does not exist" containerID="48613f9189568b464f9572351f831ed191b64b7214e1553ad723388f40b7ce49" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.520306 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48613f9189568b464f9572351f831ed191b64b7214e1553ad723388f40b7ce49"} err="failed to get container status \"48613f9189568b464f9572351f831ed191b64b7214e1553ad723388f40b7ce49\": rpc error: code = NotFound desc = could not find container \"48613f9189568b464f9572351f831ed191b64b7214e1553ad723388f40b7ce49\": container with ID starting with 48613f9189568b464f9572351f831ed191b64b7214e1553ad723388f40b7ce49 not found: ID does not exist" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.520339 4781 scope.go:117] "RemoveContainer" containerID="0da7476083cfc6517f7fd255b8c1f83f83ab7f4385bca6594fc902612e5f98c9" Feb 27 00:32:29 crc kubenswrapper[4781]: E0227 00:32:29.520688 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0da7476083cfc6517f7fd255b8c1f83f83ab7f4385bca6594fc902612e5f98c9\": container with ID starting with 
0da7476083cfc6517f7fd255b8c1f83f83ab7f4385bca6594fc902612e5f98c9 not found: ID does not exist" containerID="0da7476083cfc6517f7fd255b8c1f83f83ab7f4385bca6594fc902612e5f98c9" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.520719 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0da7476083cfc6517f7fd255b8c1f83f83ab7f4385bca6594fc902612e5f98c9"} err="failed to get container status \"0da7476083cfc6517f7fd255b8c1f83f83ab7f4385bca6594fc902612e5f98c9\": rpc error: code = NotFound desc = could not find container \"0da7476083cfc6517f7fd255b8c1f83f83ab7f4385bca6594fc902612e5f98c9\": container with ID starting with 0da7476083cfc6517f7fd255b8c1f83f83ab7f4385bca6594fc902612e5f98c9 not found: ID does not exist" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.692734 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca27d369-00b1-47ec-88cc-87d4a7065356-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4tds4\" (UID: \"ca27d369-00b1-47ec-88cc-87d4a7065356\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.693066 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca27d369-00b1-47ec-88cc-87d4a7065356-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4tds4\" (UID: \"ca27d369-00b1-47ec-88cc-87d4a7065356\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.693226 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q64vj\" (UniqueName: \"kubernetes.io/projected/ca27d369-00b1-47ec-88cc-87d4a7065356-kube-api-access-q64vj\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-4tds4\" (UID: \"ca27d369-00b1-47ec-88cc-87d4a7065356\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.794983 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca27d369-00b1-47ec-88cc-87d4a7065356-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4tds4\" (UID: \"ca27d369-00b1-47ec-88cc-87d4a7065356\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.795071 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca27d369-00b1-47ec-88cc-87d4a7065356-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4tds4\" (UID: \"ca27d369-00b1-47ec-88cc-87d4a7065356\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.795116 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q64vj\" (UniqueName: \"kubernetes.io/projected/ca27d369-00b1-47ec-88cc-87d4a7065356-kube-api-access-q64vj\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4tds4\" (UID: \"ca27d369-00b1-47ec-88cc-87d4a7065356\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.800180 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca27d369-00b1-47ec-88cc-87d4a7065356-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4tds4\" (UID: \"ca27d369-00b1-47ec-88cc-87d4a7065356\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.801452 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca27d369-00b1-47ec-88cc-87d4a7065356-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4tds4\" (UID: \"ca27d369-00b1-47ec-88cc-87d4a7065356\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.813493 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q64vj\" (UniqueName: \"kubernetes.io/projected/ca27d369-00b1-47ec-88cc-87d4a7065356-kube-api-access-q64vj\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4tds4\" (UID: \"ca27d369-00b1-47ec-88cc-87d4a7065356\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4" Feb 27 00:32:29 crc kubenswrapper[4781]: I0227 00:32:29.879284 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4" Feb 27 00:32:30 crc kubenswrapper[4781]: I0227 00:32:30.599411 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4"] Feb 27 00:32:30 crc kubenswrapper[4781]: W0227 00:32:30.607762 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca27d369_00b1_47ec_88cc_87d4a7065356.slice/crio-d04845bd198b474e4b0be70d06fbe5733ebe209b4679dc9cc937514a68b44e80 WatchSource:0}: Error finding container d04845bd198b474e4b0be70d06fbe5733ebe209b4679dc9cc937514a68b44e80: Status 404 returned error can't find the container with id d04845bd198b474e4b0be70d06fbe5733ebe209b4679dc9cc937514a68b44e80 Feb 27 00:32:31 crc kubenswrapper[4781]: I0227 00:32:31.325835 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2" path="/var/lib/kubelet/pods/48b5a0c6-df06-4a8e-8f22-e17d79c0dcb2/volumes" Feb 27 00:32:31 crc kubenswrapper[4781]: 
I0227 00:32:31.440493 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4" event={"ID":"ca27d369-00b1-47ec-88cc-87d4a7065356","Type":"ContainerStarted","Data":"7ffe4cc8d82e6022d765751060e689bfec6ab82af27a06b7cad02fcc0dcc8cb1"} Feb 27 00:32:31 crc kubenswrapper[4781]: I0227 00:32:31.440543 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4" event={"ID":"ca27d369-00b1-47ec-88cc-87d4a7065356","Type":"ContainerStarted","Data":"d04845bd198b474e4b0be70d06fbe5733ebe209b4679dc9cc937514a68b44e80"} Feb 27 00:32:31 crc kubenswrapper[4781]: I0227 00:32:31.460272 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4" podStartSLOduration=2.03062625 podStartE2EDuration="2.460251247s" podCreationTimestamp="2026-02-27 00:32:29 +0000 UTC" firstStartedPulling="2026-02-27 00:32:30.612162183 +0000 UTC m=+1619.869701747" lastFinishedPulling="2026-02-27 00:32:31.0417872 +0000 UTC m=+1620.299326744" observedRunningTime="2026-02-27 00:32:31.456382777 +0000 UTC m=+1620.713922341" watchObservedRunningTime="2026-02-27 00:32:31.460251247 +0000 UTC m=+1620.717790801" Feb 27 00:32:34 crc kubenswrapper[4781]: I0227 00:32:34.475702 4781 generic.go:334] "Generic (PLEG): container finished" podID="ca27d369-00b1-47ec-88cc-87d4a7065356" containerID="7ffe4cc8d82e6022d765751060e689bfec6ab82af27a06b7cad02fcc0dcc8cb1" exitCode=0 Feb 27 00:32:34 crc kubenswrapper[4781]: I0227 00:32:34.475814 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4" event={"ID":"ca27d369-00b1-47ec-88cc-87d4a7065356","Type":"ContainerDied","Data":"7ffe4cc8d82e6022d765751060e689bfec6ab82af27a06b7cad02fcc0dcc8cb1"} Feb 27 00:32:34 crc kubenswrapper[4781]: I0227 00:32:34.494768 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/rabbitmq-cell1-server-0" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.056642 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.197795 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q64vj\" (UniqueName: \"kubernetes.io/projected/ca27d369-00b1-47ec-88cc-87d4a7065356-kube-api-access-q64vj\") pod \"ca27d369-00b1-47ec-88cc-87d4a7065356\" (UID: \"ca27d369-00b1-47ec-88cc-87d4a7065356\") " Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.198670 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca27d369-00b1-47ec-88cc-87d4a7065356-ssh-key-openstack-edpm-ipam\") pod \"ca27d369-00b1-47ec-88cc-87d4a7065356\" (UID: \"ca27d369-00b1-47ec-88cc-87d4a7065356\") " Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.198764 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca27d369-00b1-47ec-88cc-87d4a7065356-inventory\") pod \"ca27d369-00b1-47ec-88cc-87d4a7065356\" (UID: \"ca27d369-00b1-47ec-88cc-87d4a7065356\") " Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.209917 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca27d369-00b1-47ec-88cc-87d4a7065356-kube-api-access-q64vj" (OuterVolumeSpecName: "kube-api-access-q64vj") pod "ca27d369-00b1-47ec-88cc-87d4a7065356" (UID: "ca27d369-00b1-47ec-88cc-87d4a7065356"). InnerVolumeSpecName "kube-api-access-q64vj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.233825 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca27d369-00b1-47ec-88cc-87d4a7065356-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ca27d369-00b1-47ec-88cc-87d4a7065356" (UID: "ca27d369-00b1-47ec-88cc-87d4a7065356"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.300831 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca27d369-00b1-47ec-88cc-87d4a7065356-inventory" (OuterVolumeSpecName: "inventory") pod "ca27d369-00b1-47ec-88cc-87d4a7065356" (UID: "ca27d369-00b1-47ec-88cc-87d4a7065356"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.300894 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q64vj\" (UniqueName: \"kubernetes.io/projected/ca27d369-00b1-47ec-88cc-87d4a7065356-kube-api-access-q64vj\") on node \"crc\" DevicePath \"\"" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.300912 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca27d369-00b1-47ec-88cc-87d4a7065356-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.402891 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca27d369-00b1-47ec-88cc-87d4a7065356-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.499920 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4" 
event={"ID":"ca27d369-00b1-47ec-88cc-87d4a7065356","Type":"ContainerDied","Data":"d04845bd198b474e4b0be70d06fbe5733ebe209b4679dc9cc937514a68b44e80"} Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.500206 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d04845bd198b474e4b0be70d06fbe5733ebe209b4679dc9cc937514a68b44e80" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.499989 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4tds4" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.609814 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp"] Feb 27 00:32:36 crc kubenswrapper[4781]: E0227 00:32:36.610374 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca27d369-00b1-47ec-88cc-87d4a7065356" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.610398 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca27d369-00b1-47ec-88cc-87d4a7065356" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.610709 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca27d369-00b1-47ec-88cc-87d4a7065356" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.611688 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.613668 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.614498 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.614638 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.614751 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mvxs7" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.619740 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp"] Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.708659 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94c301c2-f624-44a1-ad01-7d60748c5fca-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp\" (UID: \"94c301c2-f624-44a1-ad01-7d60748c5fca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.708942 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f9qv\" (UniqueName: \"kubernetes.io/projected/94c301c2-f624-44a1-ad01-7d60748c5fca-kube-api-access-7f9qv\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp\" (UID: \"94c301c2-f624-44a1-ad01-7d60748c5fca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.709020 4781 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c301c2-f624-44a1-ad01-7d60748c5fca-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp\" (UID: \"94c301c2-f624-44a1-ad01-7d60748c5fca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.709320 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/94c301c2-f624-44a1-ad01-7d60748c5fca-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp\" (UID: \"94c301c2-f624-44a1-ad01-7d60748c5fca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.811113 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/94c301c2-f624-44a1-ad01-7d60748c5fca-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp\" (UID: \"94c301c2-f624-44a1-ad01-7d60748c5fca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.811169 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94c301c2-f624-44a1-ad01-7d60748c5fca-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp\" (UID: \"94c301c2-f624-44a1-ad01-7d60748c5fca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.811218 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f9qv\" (UniqueName: 
\"kubernetes.io/projected/94c301c2-f624-44a1-ad01-7d60748c5fca-kube-api-access-7f9qv\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp\" (UID: \"94c301c2-f624-44a1-ad01-7d60748c5fca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.811238 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c301c2-f624-44a1-ad01-7d60748c5fca-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp\" (UID: \"94c301c2-f624-44a1-ad01-7d60748c5fca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.815059 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94c301c2-f624-44a1-ad01-7d60748c5fca-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp\" (UID: \"94c301c2-f624-44a1-ad01-7d60748c5fca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.815554 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c301c2-f624-44a1-ad01-7d60748c5fca-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp\" (UID: \"94c301c2-f624-44a1-ad01-7d60748c5fca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.815587 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/94c301c2-f624-44a1-ad01-7d60748c5fca-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp\" (UID: \"94c301c2-f624-44a1-ad01-7d60748c5fca\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.837171 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f9qv\" (UniqueName: \"kubernetes.io/projected/94c301c2-f624-44a1-ad01-7d60748c5fca-kube-api-access-7f9qv\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp\" (UID: \"94c301c2-f624-44a1-ad01-7d60748c5fca\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" Feb 27 00:32:36 crc kubenswrapper[4781]: I0227 00:32:36.972797 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" Feb 27 00:32:37 crc kubenswrapper[4781]: I0227 00:32:37.511221 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp"] Feb 27 00:32:37 crc kubenswrapper[4781]: W0227 00:32:37.512552 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94c301c2_f624_44a1_ad01_7d60748c5fca.slice/crio-0f554d217df45e166b4e450e9197072feb57a19920193828d93e3c4d8a9aab0f WatchSource:0}: Error finding container 0f554d217df45e166b4e450e9197072feb57a19920193828d93e3c4d8a9aab0f: Status 404 returned error can't find the container with id 0f554d217df45e166b4e450e9197072feb57a19920193828d93e3c4d8a9aab0f Feb 27 00:32:38 crc kubenswrapper[4781]: I0227 00:32:38.523292 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" event={"ID":"94c301c2-f624-44a1-ad01-7d60748c5fca","Type":"ContainerStarted","Data":"0f554d217df45e166b4e450e9197072feb57a19920193828d93e3c4d8a9aab0f"} Feb 27 00:32:39 crc kubenswrapper[4781]: I0227 00:32:39.535276 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" 
event={"ID":"94c301c2-f624-44a1-ad01-7d60748c5fca","Type":"ContainerStarted","Data":"c9f410e8ea0a201af9b55d546472dc35416b2a24d8046632364c67fede87b408"} Feb 27 00:32:39 crc kubenswrapper[4781]: I0227 00:32:39.553980 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" podStartSLOduration=2.432097821 podStartE2EDuration="3.553960061s" podCreationTimestamp="2026-02-27 00:32:36 +0000 UTC" firstStartedPulling="2026-02-27 00:32:37.516010923 +0000 UTC m=+1626.773550487" lastFinishedPulling="2026-02-27 00:32:38.637873163 +0000 UTC m=+1627.895412727" observedRunningTime="2026-02-27 00:32:39.548308234 +0000 UTC m=+1628.805847818" watchObservedRunningTime="2026-02-27 00:32:39.553960061 +0000 UTC m=+1628.811499615" Feb 27 00:33:27 crc kubenswrapper[4781]: I0227 00:33:27.642172 4781 scope.go:117] "RemoveContainer" containerID="914d10b311f6e761cfe3376de0d9169e16d04822bd5c0495a9b64cbbe456b1f4" Feb 27 00:33:27 crc kubenswrapper[4781]: I0227 00:33:27.693409 4781 scope.go:117] "RemoveContainer" containerID="da1dbeb22d52f0e9e8028b046b421ef782d44fa0719cff0b4421d346eb2fd5aa" Feb 27 00:33:42 crc kubenswrapper[4781]: I0227 00:33:42.895512 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:33:42 crc kubenswrapper[4781]: I0227 00:33:42.896870 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:34:00 crc kubenswrapper[4781]: I0227 00:34:00.166481 4781 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535874-9b4fw"] Feb 27 00:34:00 crc kubenswrapper[4781]: I0227 00:34:00.169667 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535874-9b4fw" Feb 27 00:34:00 crc kubenswrapper[4781]: I0227 00:34:00.175838 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:34:00 crc kubenswrapper[4781]: I0227 00:34:00.176067 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:34:00 crc kubenswrapper[4781]: I0227 00:34:00.176253 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:34:00 crc kubenswrapper[4781]: I0227 00:34:00.180372 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535874-9b4fw"] Feb 27 00:34:00 crc kubenswrapper[4781]: I0227 00:34:00.277166 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5szk9\" (UniqueName: \"kubernetes.io/projected/21bdad75-a7e5-4940-9ee3-be513a55b97d-kube-api-access-5szk9\") pod \"auto-csr-approver-29535874-9b4fw\" (UID: \"21bdad75-a7e5-4940-9ee3-be513a55b97d\") " pod="openshift-infra/auto-csr-approver-29535874-9b4fw" Feb 27 00:34:00 crc kubenswrapper[4781]: I0227 00:34:00.380507 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5szk9\" (UniqueName: \"kubernetes.io/projected/21bdad75-a7e5-4940-9ee3-be513a55b97d-kube-api-access-5szk9\") pod \"auto-csr-approver-29535874-9b4fw\" (UID: \"21bdad75-a7e5-4940-9ee3-be513a55b97d\") " pod="openshift-infra/auto-csr-approver-29535874-9b4fw" Feb 27 00:34:00 crc kubenswrapper[4781]: I0227 00:34:00.412986 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5szk9\" 
(UniqueName: \"kubernetes.io/projected/21bdad75-a7e5-4940-9ee3-be513a55b97d-kube-api-access-5szk9\") pod \"auto-csr-approver-29535874-9b4fw\" (UID: \"21bdad75-a7e5-4940-9ee3-be513a55b97d\") " pod="openshift-infra/auto-csr-approver-29535874-9b4fw" Feb 27 00:34:00 crc kubenswrapper[4781]: I0227 00:34:00.507037 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535874-9b4fw" Feb 27 00:34:01 crc kubenswrapper[4781]: I0227 00:34:01.036909 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535874-9b4fw"] Feb 27 00:34:01 crc kubenswrapper[4781]: W0227 00:34:01.043729 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21bdad75_a7e5_4940_9ee3_be513a55b97d.slice/crio-c1c1fc30ddbbac44afe2638670904d909d0b2d5a07a5af4aea258b07d7edab17 WatchSource:0}: Error finding container c1c1fc30ddbbac44afe2638670904d909d0b2d5a07a5af4aea258b07d7edab17: Status 404 returned error can't find the container with id c1c1fc30ddbbac44afe2638670904d909d0b2d5a07a5af4aea258b07d7edab17 Feb 27 00:34:01 crc kubenswrapper[4781]: I0227 00:34:01.587943 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535874-9b4fw" event={"ID":"21bdad75-a7e5-4940-9ee3-be513a55b97d","Type":"ContainerStarted","Data":"c1c1fc30ddbbac44afe2638670904d909d0b2d5a07a5af4aea258b07d7edab17"} Feb 27 00:34:02 crc kubenswrapper[4781]: I0227 00:34:02.598895 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535874-9b4fw" event={"ID":"21bdad75-a7e5-4940-9ee3-be513a55b97d","Type":"ContainerStarted","Data":"172b3310c26572010bb7e76f998ac931b571b090edac45e7e85d3b3c5cd6c47d"} Feb 27 00:34:02 crc kubenswrapper[4781]: I0227 00:34:02.624525 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535874-9b4fw" 
podStartSLOduration=1.634584515 podStartE2EDuration="2.624498816s" podCreationTimestamp="2026-02-27 00:34:00 +0000 UTC" firstStartedPulling="2026-02-27 00:34:01.058497073 +0000 UTC m=+1710.316036627" lastFinishedPulling="2026-02-27 00:34:02.048411354 +0000 UTC m=+1711.305950928" observedRunningTime="2026-02-27 00:34:02.617007552 +0000 UTC m=+1711.874547106" watchObservedRunningTime="2026-02-27 00:34:02.624498816 +0000 UTC m=+1711.882038400" Feb 27 00:34:03 crc kubenswrapper[4781]: I0227 00:34:03.624424 4781 generic.go:334] "Generic (PLEG): container finished" podID="21bdad75-a7e5-4940-9ee3-be513a55b97d" containerID="172b3310c26572010bb7e76f998ac931b571b090edac45e7e85d3b3c5cd6c47d" exitCode=0 Feb 27 00:34:03 crc kubenswrapper[4781]: I0227 00:34:03.624814 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535874-9b4fw" event={"ID":"21bdad75-a7e5-4940-9ee3-be513a55b97d","Type":"ContainerDied","Data":"172b3310c26572010bb7e76f998ac931b571b090edac45e7e85d3b3c5cd6c47d"} Feb 27 00:34:05 crc kubenswrapper[4781]: I0227 00:34:05.104004 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535874-9b4fw" Feb 27 00:34:05 crc kubenswrapper[4781]: I0227 00:34:05.296462 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5szk9\" (UniqueName: \"kubernetes.io/projected/21bdad75-a7e5-4940-9ee3-be513a55b97d-kube-api-access-5szk9\") pod \"21bdad75-a7e5-4940-9ee3-be513a55b97d\" (UID: \"21bdad75-a7e5-4940-9ee3-be513a55b97d\") " Feb 27 00:34:05 crc kubenswrapper[4781]: I0227 00:34:05.302264 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21bdad75-a7e5-4940-9ee3-be513a55b97d-kube-api-access-5szk9" (OuterVolumeSpecName: "kube-api-access-5szk9") pod "21bdad75-a7e5-4940-9ee3-be513a55b97d" (UID: "21bdad75-a7e5-4940-9ee3-be513a55b97d"). InnerVolumeSpecName "kube-api-access-5szk9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:34:05 crc kubenswrapper[4781]: I0227 00:34:05.398567 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5szk9\" (UniqueName: \"kubernetes.io/projected/21bdad75-a7e5-4940-9ee3-be513a55b97d-kube-api-access-5szk9\") on node \"crc\" DevicePath \"\"" Feb 27 00:34:05 crc kubenswrapper[4781]: I0227 00:34:05.654114 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535874-9b4fw" event={"ID":"21bdad75-a7e5-4940-9ee3-be513a55b97d","Type":"ContainerDied","Data":"c1c1fc30ddbbac44afe2638670904d909d0b2d5a07a5af4aea258b07d7edab17"} Feb 27 00:34:05 crc kubenswrapper[4781]: I0227 00:34:05.654181 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1c1fc30ddbbac44afe2638670904d909d0b2d5a07a5af4aea258b07d7edab17" Feb 27 00:34:05 crc kubenswrapper[4781]: I0227 00:34:05.654267 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535874-9b4fw" Feb 27 00:34:05 crc kubenswrapper[4781]: I0227 00:34:05.719839 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535868-f5csp"] Feb 27 00:34:05 crc kubenswrapper[4781]: I0227 00:34:05.734023 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535868-f5csp"] Feb 27 00:34:07 crc kubenswrapper[4781]: I0227 00:34:07.327480 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3df72f1-7ac9-4877-a7b4-a17b5c724303" path="/var/lib/kubelet/pods/f3df72f1-7ac9-4877-a7b4-a17b5c724303/volumes" Feb 27 00:34:12 crc kubenswrapper[4781]: I0227 00:34:12.895979 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 27 00:34:12 crc kubenswrapper[4781]: I0227 00:34:12.896546 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:34:27 crc kubenswrapper[4781]: I0227 00:34:27.809392 4781 scope.go:117] "RemoveContainer" containerID="5c6246746a3c78078a59adb64a2979be72d82f5cfd95c152a4db993cadaf1efe" Feb 27 00:34:27 crc kubenswrapper[4781]: I0227 00:34:27.842591 4781 scope.go:117] "RemoveContainer" containerID="8eb943556508c5cc9103fa044300406224b9b4973d8e501d8f7538f1c3573e24" Feb 27 00:34:27 crc kubenswrapper[4781]: I0227 00:34:27.914910 4781 scope.go:117] "RemoveContainer" containerID="6964fd56259850480217527d40244a043795966342292bb5a943a33534e5489f" Feb 27 00:34:27 crc kubenswrapper[4781]: I0227 00:34:27.957379 4781 scope.go:117] "RemoveContainer" containerID="58983f3a0d32568b0a106e31b532196dd7e3e78ec29a99f5dc4c44649ec4e605" Feb 27 00:34:28 crc kubenswrapper[4781]: I0227 00:34:28.008415 4781 scope.go:117] "RemoveContainer" containerID="7aaaa3159dfec72ce2bfd72718ace0516b0de685b4c75d813a19d16d4226019b" Feb 27 00:34:42 crc kubenswrapper[4781]: I0227 00:34:42.895987 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:34:42 crc kubenswrapper[4781]: I0227 00:34:42.897085 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:34:42 crc kubenswrapper[4781]: I0227 00:34:42.897172 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:34:42 crc kubenswrapper[4781]: I0227 00:34:42.898677 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f"} pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 00:34:42 crc kubenswrapper[4781]: I0227 00:34:42.898754 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" containerID="cri-o://ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" gracePeriod=600 Feb 27 00:34:43 crc kubenswrapper[4781]: E0227 00:34:43.029954 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:34:43 crc kubenswrapper[4781]: I0227 00:34:43.101681 4781 generic.go:334] "Generic (PLEG): container finished" podID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" exitCode=0 Feb 27 00:34:43 crc kubenswrapper[4781]: I0227 00:34:43.101737 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerDied","Data":"ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f"} Feb 27 00:34:43 crc kubenswrapper[4781]: I0227 00:34:43.101775 4781 scope.go:117] "RemoveContainer" containerID="18f81d6f38ae3802e83160171263bed0ca095345d87ab2807429711c0c761818" Feb 27 00:34:43 crc kubenswrapper[4781]: I0227 00:34:43.102755 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:34:43 crc kubenswrapper[4781]: E0227 00:34:43.103079 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:34:56 crc kubenswrapper[4781]: I0227 00:34:56.309655 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:34:56 crc kubenswrapper[4781]: E0227 00:34:56.310400 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:35:05 crc kubenswrapper[4781]: E0227 00:35:05.935752 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory 
cache]" Feb 27 00:35:07 crc kubenswrapper[4781]: I0227 00:35:07.309855 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:35:07 crc kubenswrapper[4781]: E0227 00:35:07.310366 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:35:18 crc kubenswrapper[4781]: I0227 00:35:18.309915 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:35:18 crc kubenswrapper[4781]: E0227 00:35:18.310864 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:35:28 crc kubenswrapper[4781]: I0227 00:35:28.161766 4781 scope.go:117] "RemoveContainer" containerID="7a5345b65b014bc9d0e2cd844013d91d1a91d4e408c41f0c7f4f964de80130f6" Feb 27 00:35:28 crc kubenswrapper[4781]: I0227 00:35:28.195808 4781 scope.go:117] "RemoveContainer" containerID="cfcdb38663d80d12b7e86a05dfe2ce7cc23ff17e6af4e336ba2f0e4a180806c3" Feb 27 00:35:28 crc kubenswrapper[4781]: I0227 00:35:28.223319 4781 scope.go:117] "RemoveContainer" containerID="16ba8a242e20589655027929d1c82fa25c3d9fc988018051237357efea8a8ec9" Feb 27 00:35:28 crc kubenswrapper[4781]: I0227 00:35:28.247394 4781 scope.go:117] "RemoveContainer" 
containerID="9155e1f68a6370d2a59d952aff96914080df4756a62f18bb9bbc3ec507e49ef4" Feb 27 00:35:28 crc kubenswrapper[4781]: I0227 00:35:28.266200 4781 scope.go:117] "RemoveContainer" containerID="7cb922ac2fcfd76994a7254d975044d1fe0a7563db3547acc86bfb78f94c47a2" Feb 27 00:35:28 crc kubenswrapper[4781]: I0227 00:35:28.289918 4781 scope.go:117] "RemoveContainer" containerID="bfa97c01ece2e8cbadd8eda7e12994d67d495e411ba60ed25dc9b412019a8f03" Feb 27 00:35:28 crc kubenswrapper[4781]: I0227 00:35:28.319057 4781 scope.go:117] "RemoveContainer" containerID="be2fe215086cd4058aea52c301ed09e04ac3143d7e54d38772b785701e47e5f8" Feb 27 00:35:29 crc kubenswrapper[4781]: I0227 00:35:29.309685 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:35:29 crc kubenswrapper[4781]: E0227 00:35:29.310110 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:35:32 crc kubenswrapper[4781]: I0227 00:35:32.661084 4781 generic.go:334] "Generic (PLEG): container finished" podID="94c301c2-f624-44a1-ad01-7d60748c5fca" containerID="c9f410e8ea0a201af9b55d546472dc35416b2a24d8046632364c67fede87b408" exitCode=0 Feb 27 00:35:32 crc kubenswrapper[4781]: I0227 00:35:32.661161 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" event={"ID":"94c301c2-f624-44a1-ad01-7d60748c5fca","Type":"ContainerDied","Data":"c9f410e8ea0a201af9b55d546472dc35416b2a24d8046632364c67fede87b408"} Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.188096 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.338041 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94c301c2-f624-44a1-ad01-7d60748c5fca-inventory\") pod \"94c301c2-f624-44a1-ad01-7d60748c5fca\" (UID: \"94c301c2-f624-44a1-ad01-7d60748c5fca\") " Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.338303 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7f9qv\" (UniqueName: \"kubernetes.io/projected/94c301c2-f624-44a1-ad01-7d60748c5fca-kube-api-access-7f9qv\") pod \"94c301c2-f624-44a1-ad01-7d60748c5fca\" (UID: \"94c301c2-f624-44a1-ad01-7d60748c5fca\") " Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.338419 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/94c301c2-f624-44a1-ad01-7d60748c5fca-ssh-key-openstack-edpm-ipam\") pod \"94c301c2-f624-44a1-ad01-7d60748c5fca\" (UID: \"94c301c2-f624-44a1-ad01-7d60748c5fca\") " Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.338469 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c301c2-f624-44a1-ad01-7d60748c5fca-bootstrap-combined-ca-bundle\") pod \"94c301c2-f624-44a1-ad01-7d60748c5fca\" (UID: \"94c301c2-f624-44a1-ad01-7d60748c5fca\") " Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.345144 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94c301c2-f624-44a1-ad01-7d60748c5fca-kube-api-access-7f9qv" (OuterVolumeSpecName: "kube-api-access-7f9qv") pod "94c301c2-f624-44a1-ad01-7d60748c5fca" (UID: "94c301c2-f624-44a1-ad01-7d60748c5fca"). InnerVolumeSpecName "kube-api-access-7f9qv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.349173 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94c301c2-f624-44a1-ad01-7d60748c5fca-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "94c301c2-f624-44a1-ad01-7d60748c5fca" (UID: "94c301c2-f624-44a1-ad01-7d60748c5fca"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.375136 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94c301c2-f624-44a1-ad01-7d60748c5fca-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "94c301c2-f624-44a1-ad01-7d60748c5fca" (UID: "94c301c2-f624-44a1-ad01-7d60748c5fca"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.378779 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94c301c2-f624-44a1-ad01-7d60748c5fca-inventory" (OuterVolumeSpecName: "inventory") pod "94c301c2-f624-44a1-ad01-7d60748c5fca" (UID: "94c301c2-f624-44a1-ad01-7d60748c5fca"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.440890 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7f9qv\" (UniqueName: \"kubernetes.io/projected/94c301c2-f624-44a1-ad01-7d60748c5fca-kube-api-access-7f9qv\") on node \"crc\" DevicePath \"\"" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.440930 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/94c301c2-f624-44a1-ad01-7d60748c5fca-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.440943 4781 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c301c2-f624-44a1-ad01-7d60748c5fca-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.440954 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94c301c2-f624-44a1-ad01-7d60748c5fca-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.682380 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" event={"ID":"94c301c2-f624-44a1-ad01-7d60748c5fca","Type":"ContainerDied","Data":"0f554d217df45e166b4e450e9197072feb57a19920193828d93e3c4d8a9aab0f"} Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.682423 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f554d217df45e166b4e450e9197072feb57a19920193828d93e3c4d8a9aab0f" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.682501 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.807086 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs"] Feb 27 00:35:34 crc kubenswrapper[4781]: E0227 00:35:34.807599 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94c301c2-f624-44a1-ad01-7d60748c5fca" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.807619 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="94c301c2-f624-44a1-ad01-7d60748c5fca" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 27 00:35:34 crc kubenswrapper[4781]: E0227 00:35:34.807660 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21bdad75-a7e5-4940-9ee3-be513a55b97d" containerName="oc" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.807668 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="21bdad75-a7e5-4940-9ee3-be513a55b97d" containerName="oc" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.807904 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="94c301c2-f624-44a1-ad01-7d60748c5fca" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.807921 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="21bdad75-a7e5-4940-9ee3-be513a55b97d" containerName="oc" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.808699 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.826618 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs"] Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.827158 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.827261 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mvxs7" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.827552 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.828177 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.983804 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/756e2fbc-556d-44b8-8820-e469ae73ff3b-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs\" (UID: \"756e2fbc-556d-44b8-8820-e469ae73ff3b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs" Feb 27 00:35:34 crc kubenswrapper[4781]: I0227 00:35:34.984124 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qgqr\" (UniqueName: \"kubernetes.io/projected/756e2fbc-556d-44b8-8820-e469ae73ff3b-kube-api-access-2qgqr\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs\" (UID: \"756e2fbc-556d-44b8-8820-e469ae73ff3b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs" Feb 27 00:35:34 crc 
kubenswrapper[4781]: I0227 00:35:34.984259 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/756e2fbc-556d-44b8-8820-e469ae73ff3b-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs\" (UID: \"756e2fbc-556d-44b8-8820-e469ae73ff3b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs" Feb 27 00:35:35 crc kubenswrapper[4781]: I0227 00:35:35.085997 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/756e2fbc-556d-44b8-8820-e469ae73ff3b-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs\" (UID: \"756e2fbc-556d-44b8-8820-e469ae73ff3b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs" Feb 27 00:35:35 crc kubenswrapper[4781]: I0227 00:35:35.086568 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/756e2fbc-556d-44b8-8820-e469ae73ff3b-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs\" (UID: \"756e2fbc-556d-44b8-8820-e469ae73ff3b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs" Feb 27 00:35:35 crc kubenswrapper[4781]: I0227 00:35:35.086845 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qgqr\" (UniqueName: \"kubernetes.io/projected/756e2fbc-556d-44b8-8820-e469ae73ff3b-kube-api-access-2qgqr\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs\" (UID: \"756e2fbc-556d-44b8-8820-e469ae73ff3b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs" Feb 27 00:35:35 crc kubenswrapper[4781]: I0227 00:35:35.089753 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/756e2fbc-556d-44b8-8820-e469ae73ff3b-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs\" (UID: \"756e2fbc-556d-44b8-8820-e469ae73ff3b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs" Feb 27 00:35:35 crc kubenswrapper[4781]: I0227 00:35:35.091918 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/756e2fbc-556d-44b8-8820-e469ae73ff3b-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs\" (UID: \"756e2fbc-556d-44b8-8820-e469ae73ff3b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs" Feb 27 00:35:35 crc kubenswrapper[4781]: I0227 00:35:35.103028 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qgqr\" (UniqueName: \"kubernetes.io/projected/756e2fbc-556d-44b8-8820-e469ae73ff3b-kube-api-access-2qgqr\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs\" (UID: \"756e2fbc-556d-44b8-8820-e469ae73ff3b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs" Feb 27 00:35:35 crc kubenswrapper[4781]: I0227 00:35:35.137893 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs" Feb 27 00:35:35 crc kubenswrapper[4781]: I0227 00:35:35.767033 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs"] Feb 27 00:35:36 crc kubenswrapper[4781]: I0227 00:35:36.706748 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs" event={"ID":"756e2fbc-556d-44b8-8820-e469ae73ff3b","Type":"ContainerStarted","Data":"2381181a031f2c79d017f2baa667d5dd32b801106d0766673275724fedbb49d5"} Feb 27 00:35:36 crc kubenswrapper[4781]: I0227 00:35:36.707466 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs" event={"ID":"756e2fbc-556d-44b8-8820-e469ae73ff3b","Type":"ContainerStarted","Data":"3f5e71502ea441afeaae2337c126ac0cba6512e3c3bd88e9bd95ee0b8fb3c58b"} Feb 27 00:35:36 crc kubenswrapper[4781]: I0227 00:35:36.746908 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs" podStartSLOduration=2.335755388 podStartE2EDuration="2.746885645s" podCreationTimestamp="2026-02-27 00:35:34 +0000 UTC" firstStartedPulling="2026-02-27 00:35:35.772004435 +0000 UTC m=+1805.029543989" lastFinishedPulling="2026-02-27 00:35:36.183134692 +0000 UTC m=+1805.440674246" observedRunningTime="2026-02-27 00:35:36.720915502 +0000 UTC m=+1805.978455066" watchObservedRunningTime="2026-02-27 00:35:36.746885645 +0000 UTC m=+1806.004425199" Feb 27 00:35:41 crc kubenswrapper[4781]: I0227 00:35:41.319682 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:35:41 crc kubenswrapper[4781]: E0227 00:35:41.320550 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:35:55 crc kubenswrapper[4781]: I0227 00:35:55.310084 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:35:55 crc kubenswrapper[4781]: E0227 00:35:55.310954 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:36:00 crc kubenswrapper[4781]: I0227 00:36:00.163254 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535876-2l88l"] Feb 27 00:36:00 crc kubenswrapper[4781]: I0227 00:36:00.165450 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535876-2l88l" Feb 27 00:36:00 crc kubenswrapper[4781]: I0227 00:36:00.167250 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:36:00 crc kubenswrapper[4781]: I0227 00:36:00.167752 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:36:00 crc kubenswrapper[4781]: I0227 00:36:00.173357 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:36:00 crc kubenswrapper[4781]: I0227 00:36:00.188765 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535876-2l88l"] Feb 27 00:36:00 crc kubenswrapper[4781]: I0227 00:36:00.273817 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwns7\" (UniqueName: \"kubernetes.io/projected/f9301966-9820-4623-8393-f185a0616743-kube-api-access-rwns7\") pod \"auto-csr-approver-29535876-2l88l\" (UID: \"f9301966-9820-4623-8393-f185a0616743\") " pod="openshift-infra/auto-csr-approver-29535876-2l88l" Feb 27 00:36:00 crc kubenswrapper[4781]: I0227 00:36:00.375936 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwns7\" (UniqueName: \"kubernetes.io/projected/f9301966-9820-4623-8393-f185a0616743-kube-api-access-rwns7\") pod \"auto-csr-approver-29535876-2l88l\" (UID: \"f9301966-9820-4623-8393-f185a0616743\") " pod="openshift-infra/auto-csr-approver-29535876-2l88l" Feb 27 00:36:00 crc kubenswrapper[4781]: I0227 00:36:00.394405 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwns7\" (UniqueName: \"kubernetes.io/projected/f9301966-9820-4623-8393-f185a0616743-kube-api-access-rwns7\") pod \"auto-csr-approver-29535876-2l88l\" (UID: \"f9301966-9820-4623-8393-f185a0616743\") " 
pod="openshift-infra/auto-csr-approver-29535876-2l88l" Feb 27 00:36:00 crc kubenswrapper[4781]: I0227 00:36:00.484970 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535876-2l88l" Feb 27 00:36:00 crc kubenswrapper[4781]: W0227 00:36:00.980946 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9301966_9820_4623_8393_f185a0616743.slice/crio-2ad68abae152e8f397d403a5bdc5e8e0dcddc147c700af7f42bfe61626d6ee95 WatchSource:0}: Error finding container 2ad68abae152e8f397d403a5bdc5e8e0dcddc147c700af7f42bfe61626d6ee95: Status 404 returned error can't find the container with id 2ad68abae152e8f397d403a5bdc5e8e0dcddc147c700af7f42bfe61626d6ee95 Feb 27 00:36:00 crc kubenswrapper[4781]: I0227 00:36:00.991815 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535876-2l88l"] Feb 27 00:36:00 crc kubenswrapper[4781]: I0227 00:36:00.998280 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535876-2l88l" event={"ID":"f9301966-9820-4623-8393-f185a0616743","Type":"ContainerStarted","Data":"2ad68abae152e8f397d403a5bdc5e8e0dcddc147c700af7f42bfe61626d6ee95"} Feb 27 00:36:03 crc kubenswrapper[4781]: I0227 00:36:03.020726 4781 generic.go:334] "Generic (PLEG): container finished" podID="f9301966-9820-4623-8393-f185a0616743" containerID="53c40723095bbd1b6e5cbec68ec5b0fac1a46ad7d3ad91a7ae622222a7ca48d5" exitCode=0 Feb 27 00:36:03 crc kubenswrapper[4781]: I0227 00:36:03.020830 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535876-2l88l" event={"ID":"f9301966-9820-4623-8393-f185a0616743","Type":"ContainerDied","Data":"53c40723095bbd1b6e5cbec68ec5b0fac1a46ad7d3ad91a7ae622222a7ca48d5"} Feb 27 00:36:04 crc kubenswrapper[4781]: I0227 00:36:04.532287 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535876-2l88l" Feb 27 00:36:04 crc kubenswrapper[4781]: I0227 00:36:04.681819 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwns7\" (UniqueName: \"kubernetes.io/projected/f9301966-9820-4623-8393-f185a0616743-kube-api-access-rwns7\") pod \"f9301966-9820-4623-8393-f185a0616743\" (UID: \"f9301966-9820-4623-8393-f185a0616743\") " Feb 27 00:36:04 crc kubenswrapper[4781]: I0227 00:36:04.688066 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9301966-9820-4623-8393-f185a0616743-kube-api-access-rwns7" (OuterVolumeSpecName: "kube-api-access-rwns7") pod "f9301966-9820-4623-8393-f185a0616743" (UID: "f9301966-9820-4623-8393-f185a0616743"). InnerVolumeSpecName "kube-api-access-rwns7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:36:04 crc kubenswrapper[4781]: I0227 00:36:04.784965 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwns7\" (UniqueName: \"kubernetes.io/projected/f9301966-9820-4623-8393-f185a0616743-kube-api-access-rwns7\") on node \"crc\" DevicePath \"\"" Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.046323 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535876-2l88l" event={"ID":"f9301966-9820-4623-8393-f185a0616743","Type":"ContainerDied","Data":"2ad68abae152e8f397d403a5bdc5e8e0dcddc147c700af7f42bfe61626d6ee95"} Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.046700 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ad68abae152e8f397d403a5bdc5e8e0dcddc147c700af7f42bfe61626d6ee95" Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.046376 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535876-2l88l" Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.102166 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-94sd2"] Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.116172 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-547a-account-create-update-tf2pb"] Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.128180 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-jrxqx"] Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.136823 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5255-account-create-update-k87hd"] Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.146745 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-94sd2"] Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.155761 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-f66vm"] Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.164802 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5255-account-create-update-k87hd"] Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.174151 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-jrxqx"] Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.183579 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-547a-account-create-update-tf2pb"] Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.196791 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-f66vm"] Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.333361 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cec0cd3-abcd-484c-85b8-03a44888a9b7" path="/var/lib/kubelet/pods/0cec0cd3-abcd-484c-85b8-03a44888a9b7/volumes" 
Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.334034 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c6016e5-2641-4b82-b164-121ae822f863" path="/var/lib/kubelet/pods/2c6016e5-2641-4b82-b164-121ae822f863/volumes" Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.335185 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bdd8664-6d91-4616-8095-f44067fdca51" path="/var/lib/kubelet/pods/6bdd8664-6d91-4616-8095-f44067fdca51/volumes" Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.336399 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c90ad80e-9897-4e20-b9b0-6add43c84bd0" path="/var/lib/kubelet/pods/c90ad80e-9897-4e20-b9b0-6add43c84bd0/volumes" Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.337400 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1713962-9458-45b2-9f28-61409b7ff581" path="/var/lib/kubelet/pods/f1713962-9458-45b2-9f28-61409b7ff581/volumes" Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.605347 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535870-pjv2c"] Feb 27 00:36:05 crc kubenswrapper[4781]: I0227 00:36:05.623273 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535870-pjv2c"] Feb 27 00:36:06 crc kubenswrapper[4781]: I0227 00:36:06.043895 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-8c9b-account-create-update-d29bm"] Feb 27 00:36:06 crc kubenswrapper[4781]: I0227 00:36:06.055477 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-8c9b-account-create-update-d29bm"] Feb 27 00:36:07 crc kubenswrapper[4781]: I0227 00:36:07.309410 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:36:07 crc kubenswrapper[4781]: E0227 00:36:07.310871 4781 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:36:07 crc kubenswrapper[4781]: I0227 00:36:07.322780 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8ec74af-d604-42ac-83bb-db047e8d8506" path="/var/lib/kubelet/pods/b8ec74af-d604-42ac-83bb-db047e8d8506/volumes" Feb 27 00:36:07 crc kubenswrapper[4781]: I0227 00:36:07.324392 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb4687ec-812e-48bb-8d53-ed628f3cd013" path="/var/lib/kubelet/pods/bb4687ec-812e-48bb-8d53-ed628f3cd013/volumes" Feb 27 00:36:20 crc kubenswrapper[4781]: I0227 00:36:20.309408 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:36:20 crc kubenswrapper[4781]: E0227 00:36:20.310168 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:36:28 crc kubenswrapper[4781]: I0227 00:36:28.433952 4781 scope.go:117] "RemoveContainer" containerID="a52130091c9100982624c568d31dc83849096589647f47661d0debdea301a332" Feb 27 00:36:28 crc kubenswrapper[4781]: I0227 00:36:28.467145 4781 scope.go:117] "RemoveContainer" containerID="b5253e8bb3200baca59ed8e598dc74eaddbc9fc4ea687d121523ff8347b4d62e" Feb 27 00:36:28 crc kubenswrapper[4781]: I0227 00:36:28.533333 4781 scope.go:117] 
"RemoveContainer" containerID="2520db6bdce6e0291f097369119b25f716226e74f321fc28345a81a9140017c8" Feb 27 00:36:28 crc kubenswrapper[4781]: I0227 00:36:28.612517 4781 scope.go:117] "RemoveContainer" containerID="74853e0dfa3329c0157368e93fb3d1251b7149a8041ea7981936c9bd91076b44" Feb 27 00:36:28 crc kubenswrapper[4781]: I0227 00:36:28.665617 4781 scope.go:117] "RemoveContainer" containerID="4d55d2c6e343b6a1d3b8b47dac42837612db67ccce352ab276d326d2b146954e" Feb 27 00:36:28 crc kubenswrapper[4781]: I0227 00:36:28.731201 4781 scope.go:117] "RemoveContainer" containerID="8fb72d9409a124bb8fa0479e75bf3cf0cd120b3aae8696f10bef9465f2261fc6" Feb 27 00:36:28 crc kubenswrapper[4781]: I0227 00:36:28.810381 4781 scope.go:117] "RemoveContainer" containerID="b01d66bc253f93ef989863fc6fd69c5afb4405a98783d9e32be4f4b80ce3df36" Feb 27 00:36:31 crc kubenswrapper[4781]: I0227 00:36:31.319048 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:36:31 crc kubenswrapper[4781]: E0227 00:36:31.319721 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:36:32 crc kubenswrapper[4781]: I0227 00:36:32.050203 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-a05d-account-create-update-cw8zv"] Feb 27 00:36:32 crc kubenswrapper[4781]: I0227 00:36:32.061310 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-4bde-account-create-update-fpg2t"] Feb 27 00:36:32 crc kubenswrapper[4781]: I0227 00:36:32.073028 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/root-account-create-update-wxsbg"] Feb 27 00:36:32 crc kubenswrapper[4781]: I0227 00:36:32.085377 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-m5rm5"] Feb 27 00:36:32 crc kubenswrapper[4781]: I0227 00:36:32.097588 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-create-99xdp"] Feb 27 00:36:32 crc kubenswrapper[4781]: I0227 00:36:32.110948 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-6e38-account-create-update-dntk2"] Feb 27 00:36:32 crc kubenswrapper[4781]: I0227 00:36:32.120443 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-a8a8-account-create-update-vcwwx"] Feb 27 00:36:32 crc kubenswrapper[4781]: I0227 00:36:32.129521 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-4bde-account-create-update-fpg2t"] Feb 27 00:36:32 crc kubenswrapper[4781]: I0227 00:36:32.138101 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-a05d-account-create-update-cw8zv"] Feb 27 00:36:32 crc kubenswrapper[4781]: I0227 00:36:32.147028 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-m5rm5"] Feb 27 00:36:32 crc kubenswrapper[4781]: I0227 00:36:32.157583 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-a8a8-account-create-update-vcwwx"] Feb 27 00:36:32 crc kubenswrapper[4781]: I0227 00:36:32.166268 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-create-99xdp"] Feb 27 00:36:32 crc kubenswrapper[4781]: I0227 00:36:32.176058 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-6e38-account-create-update-dntk2"] Feb 27 00:36:32 crc kubenswrapper[4781]: I0227 00:36:32.184659 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-wxsbg"] Feb 27 00:36:32 crc kubenswrapper[4781]: I0227 00:36:32.194012 4781 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-zvn4t"] Feb 27 00:36:32 crc kubenswrapper[4781]: I0227 00:36:32.203086 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-zvn4t"] Feb 27 00:36:32 crc kubenswrapper[4781]: I0227 00:36:32.211974 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-v2g9n"] Feb 27 00:36:32 crc kubenswrapper[4781]: I0227 00:36:32.222025 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-v2g9n"] Feb 27 00:36:33 crc kubenswrapper[4781]: I0227 00:36:33.344269 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0eb55288-e9bb-46f0-bae3-789e8db036cf" path="/var/lib/kubelet/pods/0eb55288-e9bb-46f0-bae3-789e8db036cf/volumes" Feb 27 00:36:33 crc kubenswrapper[4781]: I0227 00:36:33.349813 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24adb929-f812-4243-94ea-23345856d28f" path="/var/lib/kubelet/pods/24adb929-f812-4243-94ea-23345856d28f/volumes" Feb 27 00:36:33 crc kubenswrapper[4781]: I0227 00:36:33.351290 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="388be198-b438-4142-8fb8-ec9831e9a1af" path="/var/lib/kubelet/pods/388be198-b438-4142-8fb8-ec9831e9a1af/volumes" Feb 27 00:36:33 crc kubenswrapper[4781]: I0227 00:36:33.351884 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ae26ad0-3770-4153-a1d6-96ae3a9e36a9" path="/var/lib/kubelet/pods/3ae26ad0-3770-4153-a1d6-96ae3a9e36a9/volumes" Feb 27 00:36:33 crc kubenswrapper[4781]: I0227 00:36:33.353336 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b9cc074-4ea1-4c04-9398-5be68fbcd5cf" path="/var/lib/kubelet/pods/5b9cc074-4ea1-4c04-9398-5be68fbcd5cf/volumes" Feb 27 00:36:33 crc kubenswrapper[4781]: I0227 00:36:33.354825 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6344c1fe-eecb-4d57-a5c7-a857e4466439" 
path="/var/lib/kubelet/pods/6344c1fe-eecb-4d57-a5c7-a857e4466439/volumes" Feb 27 00:36:33 crc kubenswrapper[4781]: I0227 00:36:33.356760 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cafd294d-e929-4cd5-8be3-7175ad4aed09" path="/var/lib/kubelet/pods/cafd294d-e929-4cd5-8be3-7175ad4aed09/volumes" Feb 27 00:36:33 crc kubenswrapper[4781]: I0227 00:36:33.357471 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3aedfe4-2bbb-46c9-97d4-8d6782c44707" path="/var/lib/kubelet/pods/e3aedfe4-2bbb-46c9-97d4-8d6782c44707/volumes" Feb 27 00:36:33 crc kubenswrapper[4781]: I0227 00:36:33.359457 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8806487-486f-464d-8249-b6368daabff5" path="/var/lib/kubelet/pods/e8806487-486f-464d-8249-b6368daabff5/volumes" Feb 27 00:36:36 crc kubenswrapper[4781]: I0227 00:36:36.032322 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-8tmft"] Feb 27 00:36:36 crc kubenswrapper[4781]: I0227 00:36:36.040941 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-8tmft"] Feb 27 00:36:37 crc kubenswrapper[4781]: I0227 00:36:37.030768 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-rs9bx"] Feb 27 00:36:37 crc kubenswrapper[4781]: I0227 00:36:37.043595 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-rs9bx"] Feb 27 00:36:37 crc kubenswrapper[4781]: I0227 00:36:37.321033 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47cc3f01-6a5c-4797-bf86-25770e66e928" path="/var/lib/kubelet/pods/47cc3f01-6a5c-4797-bf86-25770e66e928/volumes" Feb 27 00:36:37 crc kubenswrapper[4781]: I0227 00:36:37.322565 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58b577a3-c234-4968-a8e7-c5e629de47b1" path="/var/lib/kubelet/pods/58b577a3-c234-4968-a8e7-c5e629de47b1/volumes" Feb 27 00:36:45 crc kubenswrapper[4781]: I0227 
00:36:45.310111 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:36:45 crc kubenswrapper[4781]: E0227 00:36:45.310973 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:36:56 crc kubenswrapper[4781]: I0227 00:36:56.309149 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:36:56 crc kubenswrapper[4781]: E0227 00:36:56.309948 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:37:07 crc kubenswrapper[4781]: I0227 00:37:07.727467 4781 generic.go:334] "Generic (PLEG): container finished" podID="756e2fbc-556d-44b8-8820-e469ae73ff3b" containerID="2381181a031f2c79d017f2baa667d5dd32b801106d0766673275724fedbb49d5" exitCode=0 Feb 27 00:37:07 crc kubenswrapper[4781]: I0227 00:37:07.727541 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs" event={"ID":"756e2fbc-556d-44b8-8820-e469ae73ff3b","Type":"ContainerDied","Data":"2381181a031f2c79d017f2baa667d5dd32b801106d0766673275724fedbb49d5"} Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.446605 4781 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs" Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.580563 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/756e2fbc-556d-44b8-8820-e469ae73ff3b-ssh-key-openstack-edpm-ipam\") pod \"756e2fbc-556d-44b8-8820-e469ae73ff3b\" (UID: \"756e2fbc-556d-44b8-8820-e469ae73ff3b\") " Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.580977 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qgqr\" (UniqueName: \"kubernetes.io/projected/756e2fbc-556d-44b8-8820-e469ae73ff3b-kube-api-access-2qgqr\") pod \"756e2fbc-556d-44b8-8820-e469ae73ff3b\" (UID: \"756e2fbc-556d-44b8-8820-e469ae73ff3b\") " Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.581175 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/756e2fbc-556d-44b8-8820-e469ae73ff3b-inventory\") pod \"756e2fbc-556d-44b8-8820-e469ae73ff3b\" (UID: \"756e2fbc-556d-44b8-8820-e469ae73ff3b\") " Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.586427 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/756e2fbc-556d-44b8-8820-e469ae73ff3b-kube-api-access-2qgqr" (OuterVolumeSpecName: "kube-api-access-2qgqr") pod "756e2fbc-556d-44b8-8820-e469ae73ff3b" (UID: "756e2fbc-556d-44b8-8820-e469ae73ff3b"). InnerVolumeSpecName "kube-api-access-2qgqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.618187 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/756e2fbc-556d-44b8-8820-e469ae73ff3b-inventory" (OuterVolumeSpecName: "inventory") pod "756e2fbc-556d-44b8-8820-e469ae73ff3b" (UID: "756e2fbc-556d-44b8-8820-e469ae73ff3b"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.620513 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/756e2fbc-556d-44b8-8820-e469ae73ff3b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "756e2fbc-556d-44b8-8820-e469ae73ff3b" (UID: "756e2fbc-556d-44b8-8820-e469ae73ff3b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.683689 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qgqr\" (UniqueName: \"kubernetes.io/projected/756e2fbc-556d-44b8-8820-e469ae73ff3b-kube-api-access-2qgqr\") on node \"crc\" DevicePath \"\"" Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.683736 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/756e2fbc-556d-44b8-8820-e469ae73ff3b-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.683751 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/756e2fbc-556d-44b8-8820-e469ae73ff3b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.747002 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs" event={"ID":"756e2fbc-556d-44b8-8820-e469ae73ff3b","Type":"ContainerDied","Data":"3f5e71502ea441afeaae2337c126ac0cba6512e3c3bd88e9bd95ee0b8fb3c58b"} Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.747044 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f5e71502ea441afeaae2337c126ac0cba6512e3c3bd88e9bd95ee0b8fb3c58b" Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 
00:37:09.747300 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs" Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.839161 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg"] Feb 27 00:37:09 crc kubenswrapper[4781]: E0227 00:37:09.839668 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="756e2fbc-556d-44b8-8820-e469ae73ff3b" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.839692 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="756e2fbc-556d-44b8-8820-e469ae73ff3b" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 27 00:37:09 crc kubenswrapper[4781]: E0227 00:37:09.839725 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9301966-9820-4623-8393-f185a0616743" containerName="oc" Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.839733 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9301966-9820-4623-8393-f185a0616743" containerName="oc" Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.839964 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="756e2fbc-556d-44b8-8820-e469ae73ff3b" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.839982 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9301966-9820-4623-8393-f185a0616743" containerName="oc" Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.840860 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg" Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.845355 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.845581 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.845845 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mvxs7" Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.847932 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.875724 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg"] Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.887732 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/95533111-b2e6-41c2-b7b8-edc0a82e2ca5-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg\" (UID: \"95533111-b2e6-41c2-b7b8-edc0a82e2ca5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg" Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.888132 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95533111-b2e6-41c2-b7b8-edc0a82e2ca5-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg\" (UID: \"95533111-b2e6-41c2-b7b8-edc0a82e2ca5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg" Feb 27 00:37:09 crc 
kubenswrapper[4781]: I0227 00:37:09.888286 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpltx\" (UniqueName: \"kubernetes.io/projected/95533111-b2e6-41c2-b7b8-edc0a82e2ca5-kube-api-access-wpltx\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg\" (UID: \"95533111-b2e6-41c2-b7b8-edc0a82e2ca5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg" Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.990685 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95533111-b2e6-41c2-b7b8-edc0a82e2ca5-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg\" (UID: \"95533111-b2e6-41c2-b7b8-edc0a82e2ca5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg" Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.990755 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpltx\" (UniqueName: \"kubernetes.io/projected/95533111-b2e6-41c2-b7b8-edc0a82e2ca5-kube-api-access-wpltx\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg\" (UID: \"95533111-b2e6-41c2-b7b8-edc0a82e2ca5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg" Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.990907 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/95533111-b2e6-41c2-b7b8-edc0a82e2ca5-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg\" (UID: \"95533111-b2e6-41c2-b7b8-edc0a82e2ca5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg" Feb 27 00:37:09 crc kubenswrapper[4781]: I0227 00:37:09.994880 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/95533111-b2e6-41c2-b7b8-edc0a82e2ca5-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg\" (UID: \"95533111-b2e6-41c2-b7b8-edc0a82e2ca5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg" Feb 27 00:37:10 crc kubenswrapper[4781]: I0227 00:37:10.006232 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/95533111-b2e6-41c2-b7b8-edc0a82e2ca5-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg\" (UID: \"95533111-b2e6-41c2-b7b8-edc0a82e2ca5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg" Feb 27 00:37:10 crc kubenswrapper[4781]: I0227 00:37:10.007598 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpltx\" (UniqueName: \"kubernetes.io/projected/95533111-b2e6-41c2-b7b8-edc0a82e2ca5-kube-api-access-wpltx\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg\" (UID: \"95533111-b2e6-41c2-b7b8-edc0a82e2ca5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg" Feb 27 00:37:10 crc kubenswrapper[4781]: I0227 00:37:10.162483 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg" Feb 27 00:37:10 crc kubenswrapper[4781]: I0227 00:37:10.317536 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:37:10 crc kubenswrapper[4781]: E0227 00:37:10.318078 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:37:10 crc kubenswrapper[4781]: I0227 00:37:10.775612 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg"] Feb 27 00:37:10 crc kubenswrapper[4781]: I0227 00:37:10.779330 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 00:37:11 crc kubenswrapper[4781]: I0227 00:37:11.767446 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg" event={"ID":"95533111-b2e6-41c2-b7b8-edc0a82e2ca5","Type":"ContainerStarted","Data":"d24a3b741e7a4b7b6e83691b9d42820d7bef593ed487e6d9b52037b61a1964eb"} Feb 27 00:37:11 crc kubenswrapper[4781]: I0227 00:37:11.768020 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg" event={"ID":"95533111-b2e6-41c2-b7b8-edc0a82e2ca5","Type":"ContainerStarted","Data":"6accd8b64d2459c3e4f34e1caa40c9f80e86200a6e054165b7c6c4d213fc4543"} Feb 27 00:37:11 crc kubenswrapper[4781]: I0227 00:37:11.803829 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg" podStartSLOduration=2.241094592 podStartE2EDuration="2.803808496s" podCreationTimestamp="2026-02-27 00:37:09 +0000 UTC" firstStartedPulling="2026-02-27 00:37:10.779143 +0000 UTC m=+1900.036682554" lastFinishedPulling="2026-02-27 00:37:11.341856904 +0000 UTC m=+1900.599396458" observedRunningTime="2026-02-27 00:37:11.78864456 +0000 UTC m=+1901.046184114" watchObservedRunningTime="2026-02-27 00:37:11.803808496 +0000 UTC m=+1901.061348050" Feb 27 00:37:12 crc kubenswrapper[4781]: I0227 00:37:12.042254 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-bk54r"] Feb 27 00:37:12 crc kubenswrapper[4781]: I0227 00:37:12.053292 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-bk54r"] Feb 27 00:37:13 crc kubenswrapper[4781]: I0227 00:37:13.321872 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f43ab5c-f862-468c-92c1-ec7366eb7ed0" path="/var/lib/kubelet/pods/3f43ab5c-f862-468c-92c1-ec7366eb7ed0/volumes" Feb 27 00:37:24 crc kubenswrapper[4781]: I0227 00:37:24.309113 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:37:24 crc kubenswrapper[4781]: E0227 00:37:24.310069 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:37:29 crc kubenswrapper[4781]: I0227 00:37:29.016520 4781 scope.go:117] "RemoveContainer" containerID="d19d827d09664d0dd3483609af04ecbb9a2549b9335d9da322a84e9180f2130b" Feb 27 00:37:29 crc kubenswrapper[4781]: I0227 00:37:29.060549 4781 
scope.go:117] "RemoveContainer" containerID="6d76d1e8767f2bf9f86c0f509bcf89309b39540bcf16a94f15017d9639753143" Feb 27 00:37:29 crc kubenswrapper[4781]: I0227 00:37:29.141700 4781 scope.go:117] "RemoveContainer" containerID="e75379ab5c604b926c8da8b4e1bc70d938265b4b81cac412dc92c66988d11e4a" Feb 27 00:37:29 crc kubenswrapper[4781]: I0227 00:37:29.187843 4781 scope.go:117] "RemoveContainer" containerID="6743d7b0c9868a62aac9ecae7e0ec57bc1eee6923be88c6054b55ea63c96129c" Feb 27 00:37:29 crc kubenswrapper[4781]: I0227 00:37:29.227019 4781 scope.go:117] "RemoveContainer" containerID="08f09b8baf0d256e75e4f2cea8a8050728aa867b805093cf4bae153a92736b36" Feb 27 00:37:29 crc kubenswrapper[4781]: I0227 00:37:29.273425 4781 scope.go:117] "RemoveContainer" containerID="9fc8ab8561670a45356ed0c0f51ff964f3556019e4a98628e764c0be8c981d4c" Feb 27 00:37:29 crc kubenswrapper[4781]: I0227 00:37:29.322841 4781 scope.go:117] "RemoveContainer" containerID="69da9fba4081d0816d2a2271ca344a6097bd067857fe6ffab787c65da0531cbc" Feb 27 00:37:29 crc kubenswrapper[4781]: I0227 00:37:29.373344 4781 scope.go:117] "RemoveContainer" containerID="297b6944b15c3822e081c593733409a3c29b72246756946b04eaf97a2a16c5d2" Feb 27 00:37:29 crc kubenswrapper[4781]: I0227 00:37:29.396330 4781 scope.go:117] "RemoveContainer" containerID="f6f1fd0f3e8826d700e5044d1fe1b6b827695311ff2f847e95e5ba49a2863393" Feb 27 00:37:29 crc kubenswrapper[4781]: I0227 00:37:29.425592 4781 scope.go:117] "RemoveContainer" containerID="9abe8ef3a48995708f20de72923495db036e6761eb107a6dfc8ea5dccc96bf58" Feb 27 00:37:29 crc kubenswrapper[4781]: I0227 00:37:29.462970 4781 scope.go:117] "RemoveContainer" containerID="6dace96637328dc4640d3549a1c802cf99efe23b4ad5c291813668a60dc8b49e" Feb 27 00:37:29 crc kubenswrapper[4781]: I0227 00:37:29.484987 4781 scope.go:117] "RemoveContainer" containerID="c1465b73a1df33b94300981b2d1ed1143dd7203d14e97be01d951e1a43d63b4b" Feb 27 00:37:31 crc kubenswrapper[4781]: I0227 00:37:31.050824 4781 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/placement-db-sync-jqsnp"] Feb 27 00:37:31 crc kubenswrapper[4781]: I0227 00:37:31.065573 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-jqsnp"] Feb 27 00:37:31 crc kubenswrapper[4781]: I0227 00:37:31.325367 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3fa4251-dd48-417b-8002-6df02d3d3dac" path="/var/lib/kubelet/pods/a3fa4251-dd48-417b-8002-6df02d3d3dac/volumes" Feb 27 00:37:32 crc kubenswrapper[4781]: I0227 00:37:32.039607 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-gxj6b"] Feb 27 00:37:32 crc kubenswrapper[4781]: I0227 00:37:32.048551 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-bf4zw"] Feb 27 00:37:32 crc kubenswrapper[4781]: I0227 00:37:32.057520 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-bf4zw"] Feb 27 00:37:32 crc kubenswrapper[4781]: I0227 00:37:32.066789 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-gxj6b"] Feb 27 00:37:33 crc kubenswrapper[4781]: I0227 00:37:33.327287 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="314ca901-3264-4136-b377-daad0075b72c" path="/var/lib/kubelet/pods/314ca901-3264-4136-b377-daad0075b72c/volumes" Feb 27 00:37:33 crc kubenswrapper[4781]: I0227 00:37:33.328700 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0e6d9a1-4cb3-443f-8a81-32d16c4051b1" path="/var/lib/kubelet/pods/b0e6d9a1-4cb3-443f-8a81-32d16c4051b1/volumes" Feb 27 00:37:35 crc kubenswrapper[4781]: I0227 00:37:35.046876 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-9vlp4"] Feb 27 00:37:35 crc kubenswrapper[4781]: I0227 00:37:35.059809 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-9vlp4"] Feb 27 00:37:35 crc kubenswrapper[4781]: I0227 00:37:35.321294 4781 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aef65495-ecb2-4396-bb05-a4c5ee48f291" path="/var/lib/kubelet/pods/aef65495-ecb2-4396-bb05-a4c5ee48f291/volumes" Feb 27 00:37:39 crc kubenswrapper[4781]: I0227 00:37:39.309290 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:37:39 crc kubenswrapper[4781]: E0227 00:37:39.310224 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:37:44 crc kubenswrapper[4781]: I0227 00:37:44.623651 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7qb2s"] Feb 27 00:37:44 crc kubenswrapper[4781]: I0227 00:37:44.626895 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7qb2s" Feb 27 00:37:44 crc kubenswrapper[4781]: I0227 00:37:44.634826 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7qb2s"] Feb 27 00:37:44 crc kubenswrapper[4781]: I0227 00:37:44.667645 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfghf\" (UniqueName: \"kubernetes.io/projected/711e5f04-7574-4aae-921b-84beb876849f-kube-api-access-gfghf\") pod \"certified-operators-7qb2s\" (UID: \"711e5f04-7574-4aae-921b-84beb876849f\") " pod="openshift-marketplace/certified-operators-7qb2s" Feb 27 00:37:44 crc kubenswrapper[4781]: I0227 00:37:44.668041 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/711e5f04-7574-4aae-921b-84beb876849f-catalog-content\") pod \"certified-operators-7qb2s\" (UID: \"711e5f04-7574-4aae-921b-84beb876849f\") " pod="openshift-marketplace/certified-operators-7qb2s" Feb 27 00:37:44 crc kubenswrapper[4781]: I0227 00:37:44.668174 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/711e5f04-7574-4aae-921b-84beb876849f-utilities\") pod \"certified-operators-7qb2s\" (UID: \"711e5f04-7574-4aae-921b-84beb876849f\") " pod="openshift-marketplace/certified-operators-7qb2s" Feb 27 00:37:44 crc kubenswrapper[4781]: I0227 00:37:44.770133 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/711e5f04-7574-4aae-921b-84beb876849f-catalog-content\") pod \"certified-operators-7qb2s\" (UID: \"711e5f04-7574-4aae-921b-84beb876849f\") " pod="openshift-marketplace/certified-operators-7qb2s" Feb 27 00:37:44 crc kubenswrapper[4781]: I0227 00:37:44.770171 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/711e5f04-7574-4aae-921b-84beb876849f-utilities\") pod \"certified-operators-7qb2s\" (UID: \"711e5f04-7574-4aae-921b-84beb876849f\") " pod="openshift-marketplace/certified-operators-7qb2s" Feb 27 00:37:44 crc kubenswrapper[4781]: I0227 00:37:44.770267 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfghf\" (UniqueName: \"kubernetes.io/projected/711e5f04-7574-4aae-921b-84beb876849f-kube-api-access-gfghf\") pod \"certified-operators-7qb2s\" (UID: \"711e5f04-7574-4aae-921b-84beb876849f\") " pod="openshift-marketplace/certified-operators-7qb2s" Feb 27 00:37:44 crc kubenswrapper[4781]: I0227 00:37:44.771021 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/711e5f04-7574-4aae-921b-84beb876849f-catalog-content\") pod \"certified-operators-7qb2s\" (UID: \"711e5f04-7574-4aae-921b-84beb876849f\") " pod="openshift-marketplace/certified-operators-7qb2s" Feb 27 00:37:44 crc kubenswrapper[4781]: I0227 00:37:44.771038 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/711e5f04-7574-4aae-921b-84beb876849f-utilities\") pod \"certified-operators-7qb2s\" (UID: \"711e5f04-7574-4aae-921b-84beb876849f\") " pod="openshift-marketplace/certified-operators-7qb2s" Feb 27 00:37:44 crc kubenswrapper[4781]: I0227 00:37:44.792543 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfghf\" (UniqueName: \"kubernetes.io/projected/711e5f04-7574-4aae-921b-84beb876849f-kube-api-access-gfghf\") pod \"certified-operators-7qb2s\" (UID: \"711e5f04-7574-4aae-921b-84beb876849f\") " pod="openshift-marketplace/certified-operators-7qb2s" Feb 27 00:37:44 crc kubenswrapper[4781]: I0227 00:37:44.988091 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7qb2s" Feb 27 00:37:45 crc kubenswrapper[4781]: I0227 00:37:45.488880 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7qb2s"] Feb 27 00:37:46 crc kubenswrapper[4781]: I0227 00:37:46.166450 4781 generic.go:334] "Generic (PLEG): container finished" podID="711e5f04-7574-4aae-921b-84beb876849f" containerID="9031e291266f83067134a797dbba80e693ac347bd5c2172ed7cd46a6cf10b884" exitCode=0 Feb 27 00:37:46 crc kubenswrapper[4781]: I0227 00:37:46.166546 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7qb2s" event={"ID":"711e5f04-7574-4aae-921b-84beb876849f","Type":"ContainerDied","Data":"9031e291266f83067134a797dbba80e693ac347bd5c2172ed7cd46a6cf10b884"} Feb 27 00:37:46 crc kubenswrapper[4781]: I0227 00:37:46.166891 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7qb2s" event={"ID":"711e5f04-7574-4aae-921b-84beb876849f","Type":"ContainerStarted","Data":"14c71355d65ecb6c9f56a4511e8798166b28ba0593bf8c51d3f7d5c3c0a96991"} Feb 27 00:37:47 crc kubenswrapper[4781]: I0227 00:37:47.177582 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7qb2s" event={"ID":"711e5f04-7574-4aae-921b-84beb876849f","Type":"ContainerStarted","Data":"c208faa6ef7cf6c598f0e1cf83bb5641066acee76521fb87a60110e85b787103"} Feb 27 00:37:47 crc kubenswrapper[4781]: I0227 00:37:47.217973 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q6qt2"] Feb 27 00:37:47 crc kubenswrapper[4781]: I0227 00:37:47.221330 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q6qt2" Feb 27 00:37:47 crc kubenswrapper[4781]: I0227 00:37:47.233990 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q6qt2"] Feb 27 00:37:47 crc kubenswrapper[4781]: I0227 00:37:47.331353 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7crc7\" (UniqueName: \"kubernetes.io/projected/55171d71-37a1-422f-8209-3880be373d30-kube-api-access-7crc7\") pod \"redhat-marketplace-q6qt2\" (UID: \"55171d71-37a1-422f-8209-3880be373d30\") " pod="openshift-marketplace/redhat-marketplace-q6qt2" Feb 27 00:37:47 crc kubenswrapper[4781]: I0227 00:37:47.331403 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55171d71-37a1-422f-8209-3880be373d30-utilities\") pod \"redhat-marketplace-q6qt2\" (UID: \"55171d71-37a1-422f-8209-3880be373d30\") " pod="openshift-marketplace/redhat-marketplace-q6qt2" Feb 27 00:37:47 crc kubenswrapper[4781]: I0227 00:37:47.331966 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55171d71-37a1-422f-8209-3880be373d30-catalog-content\") pod \"redhat-marketplace-q6qt2\" (UID: \"55171d71-37a1-422f-8209-3880be373d30\") " pod="openshift-marketplace/redhat-marketplace-q6qt2" Feb 27 00:37:47 crc kubenswrapper[4781]: I0227 00:37:47.434679 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7crc7\" (UniqueName: \"kubernetes.io/projected/55171d71-37a1-422f-8209-3880be373d30-kube-api-access-7crc7\") pod \"redhat-marketplace-q6qt2\" (UID: \"55171d71-37a1-422f-8209-3880be373d30\") " pod="openshift-marketplace/redhat-marketplace-q6qt2" Feb 27 00:37:47 crc kubenswrapper[4781]: I0227 00:37:47.434739 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55171d71-37a1-422f-8209-3880be373d30-utilities\") pod \"redhat-marketplace-q6qt2\" (UID: \"55171d71-37a1-422f-8209-3880be373d30\") " pod="openshift-marketplace/redhat-marketplace-q6qt2" Feb 27 00:37:47 crc kubenswrapper[4781]: I0227 00:37:47.434869 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55171d71-37a1-422f-8209-3880be373d30-catalog-content\") pod \"redhat-marketplace-q6qt2\" (UID: \"55171d71-37a1-422f-8209-3880be373d30\") " pod="openshift-marketplace/redhat-marketplace-q6qt2" Feb 27 00:37:47 crc kubenswrapper[4781]: I0227 00:37:47.435484 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55171d71-37a1-422f-8209-3880be373d30-utilities\") pod \"redhat-marketplace-q6qt2\" (UID: \"55171d71-37a1-422f-8209-3880be373d30\") " pod="openshift-marketplace/redhat-marketplace-q6qt2" Feb 27 00:37:47 crc kubenswrapper[4781]: I0227 00:37:47.435562 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55171d71-37a1-422f-8209-3880be373d30-catalog-content\") pod \"redhat-marketplace-q6qt2\" (UID: \"55171d71-37a1-422f-8209-3880be373d30\") " pod="openshift-marketplace/redhat-marketplace-q6qt2" Feb 27 00:37:47 crc kubenswrapper[4781]: I0227 00:37:47.474367 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7crc7\" (UniqueName: \"kubernetes.io/projected/55171d71-37a1-422f-8209-3880be373d30-kube-api-access-7crc7\") pod \"redhat-marketplace-q6qt2\" (UID: \"55171d71-37a1-422f-8209-3880be373d30\") " pod="openshift-marketplace/redhat-marketplace-q6qt2" Feb 27 00:37:47 crc kubenswrapper[4781]: I0227 00:37:47.538428 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q6qt2" Feb 27 00:37:48 crc kubenswrapper[4781]: W0227 00:37:48.034610 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55171d71_37a1_422f_8209_3880be373d30.slice/crio-78514d35d2a0766ea954f03f71c3b3be321ce1ff36f317d754d1cbc3feb05265 WatchSource:0}: Error finding container 78514d35d2a0766ea954f03f71c3b3be321ce1ff36f317d754d1cbc3feb05265: Status 404 returned error can't find the container with id 78514d35d2a0766ea954f03f71c3b3be321ce1ff36f317d754d1cbc3feb05265 Feb 27 00:37:48 crc kubenswrapper[4781]: I0227 00:37:48.044074 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q6qt2"] Feb 27 00:37:48 crc kubenswrapper[4781]: I0227 00:37:48.186605 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q6qt2" event={"ID":"55171d71-37a1-422f-8209-3880be373d30","Type":"ContainerStarted","Data":"78514d35d2a0766ea954f03f71c3b3be321ce1ff36f317d754d1cbc3feb05265"} Feb 27 00:37:49 crc kubenswrapper[4781]: I0227 00:37:49.200221 4781 generic.go:334] "Generic (PLEG): container finished" podID="55171d71-37a1-422f-8209-3880be373d30" containerID="8bc39157337beac107df06c43884a42b9d4291ea60f8624f07cc67a56837e5e8" exitCode=0 Feb 27 00:37:49 crc kubenswrapper[4781]: I0227 00:37:49.200304 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q6qt2" event={"ID":"55171d71-37a1-422f-8209-3880be373d30","Type":"ContainerDied","Data":"8bc39157337beac107df06c43884a42b9d4291ea60f8624f07cc67a56837e5e8"} Feb 27 00:37:50 crc kubenswrapper[4781]: I0227 00:37:50.211871 4781 generic.go:334] "Generic (PLEG): container finished" podID="711e5f04-7574-4aae-921b-84beb876849f" containerID="c208faa6ef7cf6c598f0e1cf83bb5641066acee76521fb87a60110e85b787103" exitCode=0 Feb 27 00:37:50 crc kubenswrapper[4781]: I0227 
00:37:50.211936 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7qb2s" event={"ID":"711e5f04-7574-4aae-921b-84beb876849f","Type":"ContainerDied","Data":"c208faa6ef7cf6c598f0e1cf83bb5641066acee76521fb87a60110e85b787103"} Feb 27 00:37:50 crc kubenswrapper[4781]: I0227 00:37:50.215699 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q6qt2" event={"ID":"55171d71-37a1-422f-8209-3880be373d30","Type":"ContainerStarted","Data":"3856445ba50087ef6c47431c3970aacd43c6057d8ac66e15fc2f5cd78f81733f"} Feb 27 00:37:51 crc kubenswrapper[4781]: I0227 00:37:51.228008 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7qb2s" event={"ID":"711e5f04-7574-4aae-921b-84beb876849f","Type":"ContainerStarted","Data":"dd480eb85e60a95513a79719017fbbb2afca11eb776b342e1be056d226c99893"} Feb 27 00:37:51 crc kubenswrapper[4781]: I0227 00:37:51.252118 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7qb2s" podStartSLOduration=2.477681404 podStartE2EDuration="7.252099271s" podCreationTimestamp="2026-02-27 00:37:44 +0000 UTC" firstStartedPulling="2026-02-27 00:37:46.168586913 +0000 UTC m=+1935.426126467" lastFinishedPulling="2026-02-27 00:37:50.94300476 +0000 UTC m=+1940.200544334" observedRunningTime="2026-02-27 00:37:51.248709103 +0000 UTC m=+1940.506248657" watchObservedRunningTime="2026-02-27 00:37:51.252099271 +0000 UTC m=+1940.509638825" Feb 27 00:37:52 crc kubenswrapper[4781]: I0227 00:37:52.239324 4781 generic.go:334] "Generic (PLEG): container finished" podID="55171d71-37a1-422f-8209-3880be373d30" containerID="3856445ba50087ef6c47431c3970aacd43c6057d8ac66e15fc2f5cd78f81733f" exitCode=0 Feb 27 00:37:52 crc kubenswrapper[4781]: I0227 00:37:52.239446 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q6qt2" 
event={"ID":"55171d71-37a1-422f-8209-3880be373d30","Type":"ContainerDied","Data":"3856445ba50087ef6c47431c3970aacd43c6057d8ac66e15fc2f5cd78f81733f"} Feb 27 00:37:53 crc kubenswrapper[4781]: I0227 00:37:53.254272 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q6qt2" event={"ID":"55171d71-37a1-422f-8209-3880be373d30","Type":"ContainerStarted","Data":"9c6bdd76056d19e430394a5ac7c557dd106180c96aedab3d5a3ae0c134c259f6"} Feb 27 00:37:53 crc kubenswrapper[4781]: I0227 00:37:53.282576 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q6qt2" podStartSLOduration=2.826381063 podStartE2EDuration="6.282558969s" podCreationTimestamp="2026-02-27 00:37:47 +0000 UTC" firstStartedPulling="2026-02-27 00:37:49.202181934 +0000 UTC m=+1938.459721488" lastFinishedPulling="2026-02-27 00:37:52.65835984 +0000 UTC m=+1941.915899394" observedRunningTime="2026-02-27 00:37:53.273287197 +0000 UTC m=+1942.530826761" watchObservedRunningTime="2026-02-27 00:37:53.282558969 +0000 UTC m=+1942.540098523" Feb 27 00:37:53 crc kubenswrapper[4781]: I0227 00:37:53.310601 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:37:53 crc kubenswrapper[4781]: E0227 00:37:53.311017 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:37:54 crc kubenswrapper[4781]: I0227 00:37:54.988329 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7qb2s" Feb 27 00:37:54 crc 
kubenswrapper[4781]: I0227 00:37:54.988683 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7qb2s" Feb 27 00:37:56 crc kubenswrapper[4781]: I0227 00:37:56.037189 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-7qb2s" podUID="711e5f04-7574-4aae-921b-84beb876849f" containerName="registry-server" probeResult="failure" output=< Feb 27 00:37:56 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s Feb 27 00:37:56 crc kubenswrapper[4781]: > Feb 27 00:37:57 crc kubenswrapper[4781]: I0227 00:37:57.539018 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q6qt2" Feb 27 00:37:57 crc kubenswrapper[4781]: I0227 00:37:57.539077 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q6qt2" Feb 27 00:37:57 crc kubenswrapper[4781]: I0227 00:37:57.589747 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q6qt2" Feb 27 00:37:58 crc kubenswrapper[4781]: I0227 00:37:58.359341 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q6qt2" Feb 27 00:37:58 crc kubenswrapper[4781]: I0227 00:37:58.415817 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q6qt2"] Feb 27 00:38:00 crc kubenswrapper[4781]: I0227 00:38:00.147643 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535878-49z87"] Feb 27 00:38:00 crc kubenswrapper[4781]: I0227 00:38:00.149915 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535878-49z87" Feb 27 00:38:00 crc kubenswrapper[4781]: I0227 00:38:00.152548 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:38:00 crc kubenswrapper[4781]: I0227 00:38:00.156899 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:38:00 crc kubenswrapper[4781]: I0227 00:38:00.156899 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:38:00 crc kubenswrapper[4781]: I0227 00:38:00.160611 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535878-49z87"] Feb 27 00:38:00 crc kubenswrapper[4781]: I0227 00:38:00.321840 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4hqj\" (UniqueName: \"kubernetes.io/projected/b63206fe-04b3-4f07-a4cb-f8fd89645931-kube-api-access-m4hqj\") pod \"auto-csr-approver-29535878-49z87\" (UID: \"b63206fe-04b3-4f07-a4cb-f8fd89645931\") " pod="openshift-infra/auto-csr-approver-29535878-49z87" Feb 27 00:38:00 crc kubenswrapper[4781]: I0227 00:38:00.324262 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-q6qt2" podUID="55171d71-37a1-422f-8209-3880be373d30" containerName="registry-server" containerID="cri-o://9c6bdd76056d19e430394a5ac7c557dd106180c96aedab3d5a3ae0c134c259f6" gracePeriod=2 Feb 27 00:38:00 crc kubenswrapper[4781]: I0227 00:38:00.424284 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4hqj\" (UniqueName: \"kubernetes.io/projected/b63206fe-04b3-4f07-a4cb-f8fd89645931-kube-api-access-m4hqj\") pod \"auto-csr-approver-29535878-49z87\" (UID: \"b63206fe-04b3-4f07-a4cb-f8fd89645931\") " 
pod="openshift-infra/auto-csr-approver-29535878-49z87" Feb 27 00:38:00 crc kubenswrapper[4781]: I0227 00:38:00.450129 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4hqj\" (UniqueName: \"kubernetes.io/projected/b63206fe-04b3-4f07-a4cb-f8fd89645931-kube-api-access-m4hqj\") pod \"auto-csr-approver-29535878-49z87\" (UID: \"b63206fe-04b3-4f07-a4cb-f8fd89645931\") " pod="openshift-infra/auto-csr-approver-29535878-49z87" Feb 27 00:38:00 crc kubenswrapper[4781]: I0227 00:38:00.470363 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535878-49z87" Feb 27 00:38:00 crc kubenswrapper[4781]: I0227 00:38:00.900279 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q6qt2" Feb 27 00:38:00 crc kubenswrapper[4781]: I0227 00:38:00.987619 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535878-49z87"] Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.037461 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55171d71-37a1-422f-8209-3880be373d30-utilities\") pod \"55171d71-37a1-422f-8209-3880be373d30\" (UID: \"55171d71-37a1-422f-8209-3880be373d30\") " Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.037511 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7crc7\" (UniqueName: \"kubernetes.io/projected/55171d71-37a1-422f-8209-3880be373d30-kube-api-access-7crc7\") pod \"55171d71-37a1-422f-8209-3880be373d30\" (UID: \"55171d71-37a1-422f-8209-3880be373d30\") " Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.037576 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/55171d71-37a1-422f-8209-3880be373d30-catalog-content\") pod \"55171d71-37a1-422f-8209-3880be373d30\" (UID: \"55171d71-37a1-422f-8209-3880be373d30\") " Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.038443 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55171d71-37a1-422f-8209-3880be373d30-utilities" (OuterVolumeSpecName: "utilities") pod "55171d71-37a1-422f-8209-3880be373d30" (UID: "55171d71-37a1-422f-8209-3880be373d30"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.044310 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55171d71-37a1-422f-8209-3880be373d30-kube-api-access-7crc7" (OuterVolumeSpecName: "kube-api-access-7crc7") pod "55171d71-37a1-422f-8209-3880be373d30" (UID: "55171d71-37a1-422f-8209-3880be373d30"). InnerVolumeSpecName "kube-api-access-7crc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.062697 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55171d71-37a1-422f-8209-3880be373d30-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55171d71-37a1-422f-8209-3880be373d30" (UID: "55171d71-37a1-422f-8209-3880be373d30"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.140064 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55171d71-37a1-422f-8209-3880be373d30-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.140100 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7crc7\" (UniqueName: \"kubernetes.io/projected/55171d71-37a1-422f-8209-3880be373d30-kube-api-access-7crc7\") on node \"crc\" DevicePath \"\"" Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.140112 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55171d71-37a1-422f-8209-3880be373d30-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.336056 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535878-49z87" event={"ID":"b63206fe-04b3-4f07-a4cb-f8fd89645931","Type":"ContainerStarted","Data":"984293dfb6eabf90a79acc02e28249f4b01ff5cb7665ca85189795184da119f5"} Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.339805 4781 generic.go:334] "Generic (PLEG): container finished" podID="55171d71-37a1-422f-8209-3880be373d30" containerID="9c6bdd76056d19e430394a5ac7c557dd106180c96aedab3d5a3ae0c134c259f6" exitCode=0 Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.339861 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q6qt2" Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.340028 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q6qt2" event={"ID":"55171d71-37a1-422f-8209-3880be373d30","Type":"ContainerDied","Data":"9c6bdd76056d19e430394a5ac7c557dd106180c96aedab3d5a3ae0c134c259f6"} Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.340596 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q6qt2" event={"ID":"55171d71-37a1-422f-8209-3880be373d30","Type":"ContainerDied","Data":"78514d35d2a0766ea954f03f71c3b3be321ce1ff36f317d754d1cbc3feb05265"} Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.340732 4781 scope.go:117] "RemoveContainer" containerID="9c6bdd76056d19e430394a5ac7c557dd106180c96aedab3d5a3ae0c134c259f6" Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.375512 4781 scope.go:117] "RemoveContainer" containerID="3856445ba50087ef6c47431c3970aacd43c6057d8ac66e15fc2f5cd78f81733f" Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.376597 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q6qt2"] Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.389910 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-q6qt2"] Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.397093 4781 scope.go:117] "RemoveContainer" containerID="8bc39157337beac107df06c43884a42b9d4291ea60f8624f07cc67a56837e5e8" Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.422199 4781 scope.go:117] "RemoveContainer" containerID="9c6bdd76056d19e430394a5ac7c557dd106180c96aedab3d5a3ae0c134c259f6" Feb 27 00:38:01 crc kubenswrapper[4781]: E0227 00:38:01.422690 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9c6bdd76056d19e430394a5ac7c557dd106180c96aedab3d5a3ae0c134c259f6\": container with ID starting with 9c6bdd76056d19e430394a5ac7c557dd106180c96aedab3d5a3ae0c134c259f6 not found: ID does not exist" containerID="9c6bdd76056d19e430394a5ac7c557dd106180c96aedab3d5a3ae0c134c259f6" Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.422735 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c6bdd76056d19e430394a5ac7c557dd106180c96aedab3d5a3ae0c134c259f6"} err="failed to get container status \"9c6bdd76056d19e430394a5ac7c557dd106180c96aedab3d5a3ae0c134c259f6\": rpc error: code = NotFound desc = could not find container \"9c6bdd76056d19e430394a5ac7c557dd106180c96aedab3d5a3ae0c134c259f6\": container with ID starting with 9c6bdd76056d19e430394a5ac7c557dd106180c96aedab3d5a3ae0c134c259f6 not found: ID does not exist" Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.422764 4781 scope.go:117] "RemoveContainer" containerID="3856445ba50087ef6c47431c3970aacd43c6057d8ac66e15fc2f5cd78f81733f" Feb 27 00:38:01 crc kubenswrapper[4781]: E0227 00:38:01.423054 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3856445ba50087ef6c47431c3970aacd43c6057d8ac66e15fc2f5cd78f81733f\": container with ID starting with 3856445ba50087ef6c47431c3970aacd43c6057d8ac66e15fc2f5cd78f81733f not found: ID does not exist" containerID="3856445ba50087ef6c47431c3970aacd43c6057d8ac66e15fc2f5cd78f81733f" Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.423142 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3856445ba50087ef6c47431c3970aacd43c6057d8ac66e15fc2f5cd78f81733f"} err="failed to get container status \"3856445ba50087ef6c47431c3970aacd43c6057d8ac66e15fc2f5cd78f81733f\": rpc error: code = NotFound desc = could not find container \"3856445ba50087ef6c47431c3970aacd43c6057d8ac66e15fc2f5cd78f81733f\": container with ID 
starting with 3856445ba50087ef6c47431c3970aacd43c6057d8ac66e15fc2f5cd78f81733f not found: ID does not exist" Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.423163 4781 scope.go:117] "RemoveContainer" containerID="8bc39157337beac107df06c43884a42b9d4291ea60f8624f07cc67a56837e5e8" Feb 27 00:38:01 crc kubenswrapper[4781]: E0227 00:38:01.424072 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bc39157337beac107df06c43884a42b9d4291ea60f8624f07cc67a56837e5e8\": container with ID starting with 8bc39157337beac107df06c43884a42b9d4291ea60f8624f07cc67a56837e5e8 not found: ID does not exist" containerID="8bc39157337beac107df06c43884a42b9d4291ea60f8624f07cc67a56837e5e8" Feb 27 00:38:01 crc kubenswrapper[4781]: I0227 00:38:01.424130 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bc39157337beac107df06c43884a42b9d4291ea60f8624f07cc67a56837e5e8"} err="failed to get container status \"8bc39157337beac107df06c43884a42b9d4291ea60f8624f07cc67a56837e5e8\": rpc error: code = NotFound desc = could not find container \"8bc39157337beac107df06c43884a42b9d4291ea60f8624f07cc67a56837e5e8\": container with ID starting with 8bc39157337beac107df06c43884a42b9d4291ea60f8624f07cc67a56837e5e8 not found: ID does not exist" Feb 27 00:38:02 crc kubenswrapper[4781]: I0227 00:38:02.353428 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535878-49z87" event={"ID":"b63206fe-04b3-4f07-a4cb-f8fd89645931","Type":"ContainerStarted","Data":"e098a22e98e83ab04db629aad7e6384885fe2b771dad33544e78c6562872ae4e"} Feb 27 00:38:02 crc kubenswrapper[4781]: I0227 00:38:02.369423 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535878-49z87" podStartSLOduration=1.48741801 podStartE2EDuration="2.36939902s" podCreationTimestamp="2026-02-27 00:38:00 +0000 UTC" 
firstStartedPulling="2026-02-27 00:38:00.997419666 +0000 UTC m=+1950.254959210" lastFinishedPulling="2026-02-27 00:38:01.879400666 +0000 UTC m=+1951.136940220" observedRunningTime="2026-02-27 00:38:02.367236613 +0000 UTC m=+1951.624776167" watchObservedRunningTime="2026-02-27 00:38:02.36939902 +0000 UTC m=+1951.626938584" Feb 27 00:38:03 crc kubenswrapper[4781]: I0227 00:38:03.323970 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55171d71-37a1-422f-8209-3880be373d30" path="/var/lib/kubelet/pods/55171d71-37a1-422f-8209-3880be373d30/volumes" Feb 27 00:38:03 crc kubenswrapper[4781]: I0227 00:38:03.365881 4781 generic.go:334] "Generic (PLEG): container finished" podID="b63206fe-04b3-4f07-a4cb-f8fd89645931" containerID="e098a22e98e83ab04db629aad7e6384885fe2b771dad33544e78c6562872ae4e" exitCode=0 Feb 27 00:38:03 crc kubenswrapper[4781]: I0227 00:38:03.365940 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535878-49z87" event={"ID":"b63206fe-04b3-4f07-a4cb-f8fd89645931","Type":"ContainerDied","Data":"e098a22e98e83ab04db629aad7e6384885fe2b771dad33544e78c6562872ae4e"} Feb 27 00:38:04 crc kubenswrapper[4781]: I0227 00:38:04.309977 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:38:04 crc kubenswrapper[4781]: E0227 00:38:04.310276 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:38:04 crc kubenswrapper[4781]: I0227 00:38:04.832775 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535878-49z87" Feb 27 00:38:04 crc kubenswrapper[4781]: I0227 00:38:04.921898 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4hqj\" (UniqueName: \"kubernetes.io/projected/b63206fe-04b3-4f07-a4cb-f8fd89645931-kube-api-access-m4hqj\") pod \"b63206fe-04b3-4f07-a4cb-f8fd89645931\" (UID: \"b63206fe-04b3-4f07-a4cb-f8fd89645931\") " Feb 27 00:38:04 crc kubenswrapper[4781]: I0227 00:38:04.928334 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b63206fe-04b3-4f07-a4cb-f8fd89645931-kube-api-access-m4hqj" (OuterVolumeSpecName: "kube-api-access-m4hqj") pod "b63206fe-04b3-4f07-a4cb-f8fd89645931" (UID: "b63206fe-04b3-4f07-a4cb-f8fd89645931"). InnerVolumeSpecName "kube-api-access-m4hqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:38:05 crc kubenswrapper[4781]: I0227 00:38:05.024462 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4hqj\" (UniqueName: \"kubernetes.io/projected/b63206fe-04b3-4f07-a4cb-f8fd89645931-kube-api-access-m4hqj\") on node \"crc\" DevicePath \"\"" Feb 27 00:38:05 crc kubenswrapper[4781]: I0227 00:38:05.034309 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7qb2s" Feb 27 00:38:05 crc kubenswrapper[4781]: I0227 00:38:05.080249 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7qb2s" Feb 27 00:38:05 crc kubenswrapper[4781]: I0227 00:38:05.268276 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7qb2s"] Feb 27 00:38:05 crc kubenswrapper[4781]: I0227 00:38:05.388941 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535878-49z87" 
event={"ID":"b63206fe-04b3-4f07-a4cb-f8fd89645931","Type":"ContainerDied","Data":"984293dfb6eabf90a79acc02e28249f4b01ff5cb7665ca85189795184da119f5"} Feb 27 00:38:05 crc kubenswrapper[4781]: I0227 00:38:05.389465 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="984293dfb6eabf90a79acc02e28249f4b01ff5cb7665ca85189795184da119f5" Feb 27 00:38:05 crc kubenswrapper[4781]: I0227 00:38:05.388972 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535878-49z87" Feb 27 00:38:05 crc kubenswrapper[4781]: I0227 00:38:05.445933 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535872-fpvhr"] Feb 27 00:38:05 crc kubenswrapper[4781]: I0227 00:38:05.455216 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535872-fpvhr"] Feb 27 00:38:06 crc kubenswrapper[4781]: I0227 00:38:06.397576 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7qb2s" podUID="711e5f04-7574-4aae-921b-84beb876849f" containerName="registry-server" containerID="cri-o://dd480eb85e60a95513a79719017fbbb2afca11eb776b342e1be056d226c99893" gracePeriod=2 Feb 27 00:38:06 crc kubenswrapper[4781]: I0227 00:38:06.948991 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7qb2s" Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.066475 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfghf\" (UniqueName: \"kubernetes.io/projected/711e5f04-7574-4aae-921b-84beb876849f-kube-api-access-gfghf\") pod \"711e5f04-7574-4aae-921b-84beb876849f\" (UID: \"711e5f04-7574-4aae-921b-84beb876849f\") " Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.066570 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/711e5f04-7574-4aae-921b-84beb876849f-utilities\") pod \"711e5f04-7574-4aae-921b-84beb876849f\" (UID: \"711e5f04-7574-4aae-921b-84beb876849f\") " Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.066723 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/711e5f04-7574-4aae-921b-84beb876849f-catalog-content\") pod \"711e5f04-7574-4aae-921b-84beb876849f\" (UID: \"711e5f04-7574-4aae-921b-84beb876849f\") " Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.068312 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/711e5f04-7574-4aae-921b-84beb876849f-utilities" (OuterVolumeSpecName: "utilities") pod "711e5f04-7574-4aae-921b-84beb876849f" (UID: "711e5f04-7574-4aae-921b-84beb876849f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.090839 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/711e5f04-7574-4aae-921b-84beb876849f-kube-api-access-gfghf" (OuterVolumeSpecName: "kube-api-access-gfghf") pod "711e5f04-7574-4aae-921b-84beb876849f" (UID: "711e5f04-7574-4aae-921b-84beb876849f"). InnerVolumeSpecName "kube-api-access-gfghf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.136821 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/711e5f04-7574-4aae-921b-84beb876849f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "711e5f04-7574-4aae-921b-84beb876849f" (UID: "711e5f04-7574-4aae-921b-84beb876849f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.169030 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfghf\" (UniqueName: \"kubernetes.io/projected/711e5f04-7574-4aae-921b-84beb876849f-kube-api-access-gfghf\") on node \"crc\" DevicePath \"\"" Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.169057 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/711e5f04-7574-4aae-921b-84beb876849f-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.169090 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/711e5f04-7574-4aae-921b-84beb876849f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.321479 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28ad6440-a4bb-43a6-985a-42979a799437" path="/var/lib/kubelet/pods/28ad6440-a4bb-43a6-985a-42979a799437/volumes" Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.409416 4781 generic.go:334] "Generic (PLEG): container finished" podID="711e5f04-7574-4aae-921b-84beb876849f" containerID="dd480eb85e60a95513a79719017fbbb2afca11eb776b342e1be056d226c99893" exitCode=0 Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.409471 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7qb2s" 
event={"ID":"711e5f04-7574-4aae-921b-84beb876849f","Type":"ContainerDied","Data":"dd480eb85e60a95513a79719017fbbb2afca11eb776b342e1be056d226c99893"} Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.409502 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7qb2s" event={"ID":"711e5f04-7574-4aae-921b-84beb876849f","Type":"ContainerDied","Data":"14c71355d65ecb6c9f56a4511e8798166b28ba0593bf8c51d3f7d5c3c0a96991"} Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.409522 4781 scope.go:117] "RemoveContainer" containerID="dd480eb85e60a95513a79719017fbbb2afca11eb776b342e1be056d226c99893" Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.410173 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7qb2s" Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.436973 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7qb2s"] Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.438349 4781 scope.go:117] "RemoveContainer" containerID="c208faa6ef7cf6c598f0e1cf83bb5641066acee76521fb87a60110e85b787103" Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.447128 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7qb2s"] Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.460454 4781 scope.go:117] "RemoveContainer" containerID="9031e291266f83067134a797dbba80e693ac347bd5c2172ed7cd46a6cf10b884" Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.515509 4781 scope.go:117] "RemoveContainer" containerID="dd480eb85e60a95513a79719017fbbb2afca11eb776b342e1be056d226c99893" Feb 27 00:38:07 crc kubenswrapper[4781]: E0227 00:38:07.515993 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd480eb85e60a95513a79719017fbbb2afca11eb776b342e1be056d226c99893\": container 
with ID starting with dd480eb85e60a95513a79719017fbbb2afca11eb776b342e1be056d226c99893 not found: ID does not exist" containerID="dd480eb85e60a95513a79719017fbbb2afca11eb776b342e1be056d226c99893" Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.516060 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd480eb85e60a95513a79719017fbbb2afca11eb776b342e1be056d226c99893"} err="failed to get container status \"dd480eb85e60a95513a79719017fbbb2afca11eb776b342e1be056d226c99893\": rpc error: code = NotFound desc = could not find container \"dd480eb85e60a95513a79719017fbbb2afca11eb776b342e1be056d226c99893\": container with ID starting with dd480eb85e60a95513a79719017fbbb2afca11eb776b342e1be056d226c99893 not found: ID does not exist" Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.516090 4781 scope.go:117] "RemoveContainer" containerID="c208faa6ef7cf6c598f0e1cf83bb5641066acee76521fb87a60110e85b787103" Feb 27 00:38:07 crc kubenswrapper[4781]: E0227 00:38:07.516441 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c208faa6ef7cf6c598f0e1cf83bb5641066acee76521fb87a60110e85b787103\": container with ID starting with c208faa6ef7cf6c598f0e1cf83bb5641066acee76521fb87a60110e85b787103 not found: ID does not exist" containerID="c208faa6ef7cf6c598f0e1cf83bb5641066acee76521fb87a60110e85b787103" Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.516464 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c208faa6ef7cf6c598f0e1cf83bb5641066acee76521fb87a60110e85b787103"} err="failed to get container status \"c208faa6ef7cf6c598f0e1cf83bb5641066acee76521fb87a60110e85b787103\": rpc error: code = NotFound desc = could not find container \"c208faa6ef7cf6c598f0e1cf83bb5641066acee76521fb87a60110e85b787103\": container with ID starting with c208faa6ef7cf6c598f0e1cf83bb5641066acee76521fb87a60110e85b787103 not 
found: ID does not exist" Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.516479 4781 scope.go:117] "RemoveContainer" containerID="9031e291266f83067134a797dbba80e693ac347bd5c2172ed7cd46a6cf10b884" Feb 27 00:38:07 crc kubenswrapper[4781]: E0227 00:38:07.516810 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9031e291266f83067134a797dbba80e693ac347bd5c2172ed7cd46a6cf10b884\": container with ID starting with 9031e291266f83067134a797dbba80e693ac347bd5c2172ed7cd46a6cf10b884 not found: ID does not exist" containerID="9031e291266f83067134a797dbba80e693ac347bd5c2172ed7cd46a6cf10b884" Feb 27 00:38:07 crc kubenswrapper[4781]: I0227 00:38:07.516848 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9031e291266f83067134a797dbba80e693ac347bd5c2172ed7cd46a6cf10b884"} err="failed to get container status \"9031e291266f83067134a797dbba80e693ac347bd5c2172ed7cd46a6cf10b884\": rpc error: code = NotFound desc = could not find container \"9031e291266f83067134a797dbba80e693ac347bd5c2172ed7cd46a6cf10b884\": container with ID starting with 9031e291266f83067134a797dbba80e693ac347bd5c2172ed7cd46a6cf10b884 not found: ID does not exist" Feb 27 00:38:09 crc kubenswrapper[4781]: I0227 00:38:09.322553 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="711e5f04-7574-4aae-921b-84beb876849f" path="/var/lib/kubelet/pods/711e5f04-7574-4aae-921b-84beb876849f/volumes" Feb 27 00:38:17 crc kubenswrapper[4781]: I0227 00:38:17.519336 4781 generic.go:334] "Generic (PLEG): container finished" podID="95533111-b2e6-41c2-b7b8-edc0a82e2ca5" containerID="d24a3b741e7a4b7b6e83691b9d42820d7bef593ed487e6d9b52037b61a1964eb" exitCode=0 Feb 27 00:38:17 crc kubenswrapper[4781]: I0227 00:38:17.519440 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg" 
event={"ID":"95533111-b2e6-41c2-b7b8-edc0a82e2ca5","Type":"ContainerDied","Data":"d24a3b741e7a4b7b6e83691b9d42820d7bef593ed487e6d9b52037b61a1964eb"} Feb 27 00:38:18 crc kubenswrapper[4781]: I0227 00:38:18.312017 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:38:18 crc kubenswrapper[4781]: E0227 00:38:18.312704 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.096307 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg" Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.225874 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpltx\" (UniqueName: \"kubernetes.io/projected/95533111-b2e6-41c2-b7b8-edc0a82e2ca5-kube-api-access-wpltx\") pod \"95533111-b2e6-41c2-b7b8-edc0a82e2ca5\" (UID: \"95533111-b2e6-41c2-b7b8-edc0a82e2ca5\") " Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.226019 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/95533111-b2e6-41c2-b7b8-edc0a82e2ca5-ssh-key-openstack-edpm-ipam\") pod \"95533111-b2e6-41c2-b7b8-edc0a82e2ca5\" (UID: \"95533111-b2e6-41c2-b7b8-edc0a82e2ca5\") " Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.226213 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/95533111-b2e6-41c2-b7b8-edc0a82e2ca5-inventory\") pod \"95533111-b2e6-41c2-b7b8-edc0a82e2ca5\" (UID: \"95533111-b2e6-41c2-b7b8-edc0a82e2ca5\") " Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.254817 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95533111-b2e6-41c2-b7b8-edc0a82e2ca5-kube-api-access-wpltx" (OuterVolumeSpecName: "kube-api-access-wpltx") pod "95533111-b2e6-41c2-b7b8-edc0a82e2ca5" (UID: "95533111-b2e6-41c2-b7b8-edc0a82e2ca5"). InnerVolumeSpecName "kube-api-access-wpltx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.257119 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95533111-b2e6-41c2-b7b8-edc0a82e2ca5-inventory" (OuterVolumeSpecName: "inventory") pod "95533111-b2e6-41c2-b7b8-edc0a82e2ca5" (UID: "95533111-b2e6-41c2-b7b8-edc0a82e2ca5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.264675 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95533111-b2e6-41c2-b7b8-edc0a82e2ca5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "95533111-b2e6-41c2-b7b8-edc0a82e2ca5" (UID: "95533111-b2e6-41c2-b7b8-edc0a82e2ca5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.328891 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpltx\" (UniqueName: \"kubernetes.io/projected/95533111-b2e6-41c2-b7b8-edc0a82e2ca5-kube-api-access-wpltx\") on node \"crc\" DevicePath \"\"" Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.328924 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/95533111-b2e6-41c2-b7b8-edc0a82e2ca5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.328936 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95533111-b2e6-41c2-b7b8-edc0a82e2ca5-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.539380 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg" event={"ID":"95533111-b2e6-41c2-b7b8-edc0a82e2ca5","Type":"ContainerDied","Data":"6accd8b64d2459c3e4f34e1caa40c9f80e86200a6e054165b7c6c4d213fc4543"} Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.539418 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6accd8b64d2459c3e4f34e1caa40c9f80e86200a6e054165b7c6c4d213fc4543" Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.539421 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg" Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.617456 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j"] Feb 27 00:38:19 crc kubenswrapper[4781]: E0227 00:38:19.617994 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b63206fe-04b3-4f07-a4cb-f8fd89645931" containerName="oc" Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.618018 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b63206fe-04b3-4f07-a4cb-f8fd89645931" containerName="oc" Feb 27 00:38:19 crc kubenswrapper[4781]: E0227 00:38:19.618039 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95533111-b2e6-41c2-b7b8-edc0a82e2ca5" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.618049 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="95533111-b2e6-41c2-b7b8-edc0a82e2ca5" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 27 00:38:19 crc kubenswrapper[4781]: E0227 00:38:19.618072 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="711e5f04-7574-4aae-921b-84beb876849f" containerName="extract-utilities" Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.618081 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="711e5f04-7574-4aae-921b-84beb876849f" containerName="extract-utilities" Feb 27 00:38:19 crc kubenswrapper[4781]: E0227 00:38:19.618096 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="711e5f04-7574-4aae-921b-84beb876849f" containerName="registry-server" Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.618103 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="711e5f04-7574-4aae-921b-84beb876849f" containerName="registry-server" Feb 27 00:38:19 crc kubenswrapper[4781]: E0227 00:38:19.618116 
4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55171d71-37a1-422f-8209-3880be373d30" containerName="extract-content" Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.618124 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="55171d71-37a1-422f-8209-3880be373d30" containerName="extract-content" Feb 27 00:38:19 crc kubenswrapper[4781]: E0227 00:38:19.618133 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55171d71-37a1-422f-8209-3880be373d30" containerName="registry-server" Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.618140 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="55171d71-37a1-422f-8209-3880be373d30" containerName="registry-server" Feb 27 00:38:19 crc kubenswrapper[4781]: E0227 00:38:19.618157 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55171d71-37a1-422f-8209-3880be373d30" containerName="extract-utilities" Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.618164 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="55171d71-37a1-422f-8209-3880be373d30" containerName="extract-utilities" Feb 27 00:38:19 crc kubenswrapper[4781]: E0227 00:38:19.618188 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="711e5f04-7574-4aae-921b-84beb876849f" containerName="extract-content" Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.618197 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="711e5f04-7574-4aae-921b-84beb876849f" containerName="extract-content" Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.618423 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="b63206fe-04b3-4f07-a4cb-f8fd89645931" containerName="oc" Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.618439 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="711e5f04-7574-4aae-921b-84beb876849f" containerName="registry-server" Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.618452 4781 
memory_manager.go:354] "RemoveStaleState removing state" podUID="55171d71-37a1-422f-8209-3880be373d30" containerName="registry-server" Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.618491 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="95533111-b2e6-41c2-b7b8-edc0a82e2ca5" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.619457 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j" Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.628490 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.628895 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mvxs7" Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.629555 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.629759 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.634875 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9f7ced88-662a-42f0-8385-97292a7f4ce4-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j\" (UID: \"9f7ced88-662a-42f0-8385-97292a7f4ce4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j" Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.634995 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/9f7ced88-662a-42f0-8385-97292a7f4ce4-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j\" (UID: \"9f7ced88-662a-42f0-8385-97292a7f4ce4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j" Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.635094 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv52b\" (UniqueName: \"kubernetes.io/projected/9f7ced88-662a-42f0-8385-97292a7f4ce4-kube-api-access-xv52b\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j\" (UID: \"9f7ced88-662a-42f0-8385-97292a7f4ce4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j" Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.647447 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j"] Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.736471 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9f7ced88-662a-42f0-8385-97292a7f4ce4-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j\" (UID: \"9f7ced88-662a-42f0-8385-97292a7f4ce4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j" Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.736547 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f7ced88-662a-42f0-8385-97292a7f4ce4-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j\" (UID: \"9f7ced88-662a-42f0-8385-97292a7f4ce4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j" Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.736614 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xv52b\" (UniqueName: \"kubernetes.io/projected/9f7ced88-662a-42f0-8385-97292a7f4ce4-kube-api-access-xv52b\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j\" (UID: \"9f7ced88-662a-42f0-8385-97292a7f4ce4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j" Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.740559 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f7ced88-662a-42f0-8385-97292a7f4ce4-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j\" (UID: \"9f7ced88-662a-42f0-8385-97292a7f4ce4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j" Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.740729 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9f7ced88-662a-42f0-8385-97292a7f4ce4-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j\" (UID: \"9f7ced88-662a-42f0-8385-97292a7f4ce4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j" Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.757502 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv52b\" (UniqueName: \"kubernetes.io/projected/9f7ced88-662a-42f0-8385-97292a7f4ce4-kube-api-access-xv52b\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j\" (UID: \"9f7ced88-662a-42f0-8385-97292a7f4ce4\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j" Feb 27 00:38:19 crc kubenswrapper[4781]: I0227 00:38:19.939079 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j" Feb 27 00:38:20 crc kubenswrapper[4781]: I0227 00:38:20.482834 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j"] Feb 27 00:38:20 crc kubenswrapper[4781]: I0227 00:38:20.550439 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j" event={"ID":"9f7ced88-662a-42f0-8385-97292a7f4ce4","Type":"ContainerStarted","Data":"8440615a76d5270c1652e37a051e61f9bca649ede1374e62c9bf67b4732ac080"} Feb 27 00:38:22 crc kubenswrapper[4781]: I0227 00:38:22.568167 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j" event={"ID":"9f7ced88-662a-42f0-8385-97292a7f4ce4","Type":"ContainerStarted","Data":"f63120de9863b31e6b1e80d8f68fb4bd43f35e4812c5407414823adca9d621df"} Feb 27 00:38:22 crc kubenswrapper[4781]: I0227 00:38:22.591438 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j" podStartSLOduration=2.3243168020000002 podStartE2EDuration="3.591418618s" podCreationTimestamp="2026-02-27 00:38:19 +0000 UTC" firstStartedPulling="2026-02-27 00:38:20.488252971 +0000 UTC m=+1969.745792525" lastFinishedPulling="2026-02-27 00:38:21.755354787 +0000 UTC m=+1971.012894341" observedRunningTime="2026-02-27 00:38:22.587420244 +0000 UTC m=+1971.844959798" watchObservedRunningTime="2026-02-27 00:38:22.591418618 +0000 UTC m=+1971.848958172" Feb 27 00:38:26 crc kubenswrapper[4781]: I0227 00:38:26.605914 4781 generic.go:334] "Generic (PLEG): container finished" podID="9f7ced88-662a-42f0-8385-97292a7f4ce4" containerID="f63120de9863b31e6b1e80d8f68fb4bd43f35e4812c5407414823adca9d621df" exitCode=0 Feb 27 00:38:26 crc kubenswrapper[4781]: I0227 00:38:26.607477 4781 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j" event={"ID":"9f7ced88-662a-42f0-8385-97292a7f4ce4","Type":"ContainerDied","Data":"f63120de9863b31e6b1e80d8f68fb4bd43f35e4812c5407414823adca9d621df"} Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.162909 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.273316 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9f7ced88-662a-42f0-8385-97292a7f4ce4-ssh-key-openstack-edpm-ipam\") pod \"9f7ced88-662a-42f0-8385-97292a7f4ce4\" (UID: \"9f7ced88-662a-42f0-8385-97292a7f4ce4\") " Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.273437 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xv52b\" (UniqueName: \"kubernetes.io/projected/9f7ced88-662a-42f0-8385-97292a7f4ce4-kube-api-access-xv52b\") pod \"9f7ced88-662a-42f0-8385-97292a7f4ce4\" (UID: \"9f7ced88-662a-42f0-8385-97292a7f4ce4\") " Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.273767 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f7ced88-662a-42f0-8385-97292a7f4ce4-inventory\") pod \"9f7ced88-662a-42f0-8385-97292a7f4ce4\" (UID: \"9f7ced88-662a-42f0-8385-97292a7f4ce4\") " Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.279801 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f7ced88-662a-42f0-8385-97292a7f4ce4-kube-api-access-xv52b" (OuterVolumeSpecName: "kube-api-access-xv52b") pod "9f7ced88-662a-42f0-8385-97292a7f4ce4" (UID: "9f7ced88-662a-42f0-8385-97292a7f4ce4"). InnerVolumeSpecName "kube-api-access-xv52b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.305489 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f7ced88-662a-42f0-8385-97292a7f4ce4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9f7ced88-662a-42f0-8385-97292a7f4ce4" (UID: "9f7ced88-662a-42f0-8385-97292a7f4ce4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.307819 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f7ced88-662a-42f0-8385-97292a7f4ce4-inventory" (OuterVolumeSpecName: "inventory") pod "9f7ced88-662a-42f0-8385-97292a7f4ce4" (UID: "9f7ced88-662a-42f0-8385-97292a7f4ce4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.376362 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f7ced88-662a-42f0-8385-97292a7f4ce4-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.376402 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9f7ced88-662a-42f0-8385-97292a7f4ce4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.376414 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xv52b\" (UniqueName: \"kubernetes.io/projected/9f7ced88-662a-42f0-8385-97292a7f4ce4-kube-api-access-xv52b\") on node \"crc\" DevicePath \"\"" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.652998 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j" 
event={"ID":"9f7ced88-662a-42f0-8385-97292a7f4ce4","Type":"ContainerDied","Data":"8440615a76d5270c1652e37a051e61f9bca649ede1374e62c9bf67b4732ac080"} Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.653063 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8440615a76d5270c1652e37a051e61f9bca649ede1374e62c9bf67b4732ac080" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.653089 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.823851 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj"] Feb 27 00:38:28 crc kubenswrapper[4781]: E0227 00:38:28.824496 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f7ced88-662a-42f0-8385-97292a7f4ce4" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.824514 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f7ced88-662a-42f0-8385-97292a7f4ce4" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.824728 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f7ced88-662a-42f0-8385-97292a7f4ce4" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.825470 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.827564 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.827572 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mvxs7" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.828192 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.831828 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.844907 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj"] Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.888267 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29e8157f-b610-48f3-93ac-9173fa6d484a-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rrxrj\" (UID: \"29e8157f-b610-48f3-93ac-9173fa6d484a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.888616 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29e8157f-b610-48f3-93ac-9173fa6d484a-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rrxrj\" (UID: \"29e8157f-b610-48f3-93ac-9173fa6d484a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.888864 4781 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh2pn\" (UniqueName: \"kubernetes.io/projected/29e8157f-b610-48f3-93ac-9173fa6d484a-kube-api-access-gh2pn\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rrxrj\" (UID: \"29e8157f-b610-48f3-93ac-9173fa6d484a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.990854 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29e8157f-b610-48f3-93ac-9173fa6d484a-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rrxrj\" (UID: \"29e8157f-b610-48f3-93ac-9173fa6d484a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.990917 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh2pn\" (UniqueName: \"kubernetes.io/projected/29e8157f-b610-48f3-93ac-9173fa6d484a-kube-api-access-gh2pn\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rrxrj\" (UID: \"29e8157f-b610-48f3-93ac-9173fa6d484a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.991036 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29e8157f-b610-48f3-93ac-9173fa6d484a-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rrxrj\" (UID: \"29e8157f-b610-48f3-93ac-9173fa6d484a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj" Feb 27 00:38:28 crc kubenswrapper[4781]: I0227 00:38:28.997484 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29e8157f-b610-48f3-93ac-9173fa6d484a-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-rrxrj\" (UID: \"29e8157f-b610-48f3-93ac-9173fa6d484a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj" Feb 27 00:38:29 crc kubenswrapper[4781]: I0227 00:38:29.002517 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29e8157f-b610-48f3-93ac-9173fa6d484a-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rrxrj\" (UID: \"29e8157f-b610-48f3-93ac-9173fa6d484a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj" Feb 27 00:38:29 crc kubenswrapper[4781]: I0227 00:38:29.008947 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh2pn\" (UniqueName: \"kubernetes.io/projected/29e8157f-b610-48f3-93ac-9173fa6d484a-kube-api-access-gh2pn\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rrxrj\" (UID: \"29e8157f-b610-48f3-93ac-9173fa6d484a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj" Feb 27 00:38:29 crc kubenswrapper[4781]: I0227 00:38:29.143514 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj" Feb 27 00:38:29 crc kubenswrapper[4781]: I0227 00:38:29.725239 4781 scope.go:117] "RemoveContainer" containerID="d4ee7796e64f1964f0ab74414c33a59e4f95e98e4eb4a260e730590563ac50fe" Feb 27 00:38:29 crc kubenswrapper[4781]: I0227 00:38:29.729549 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj"] Feb 27 00:38:29 crc kubenswrapper[4781]: W0227 00:38:29.735656 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29e8157f_b610_48f3_93ac_9173fa6d484a.slice/crio-76301ff13c2e5a35dd505b8b42a308f0dea15e75e138c72c3fd0670cba71e23e WatchSource:0}: Error finding container 76301ff13c2e5a35dd505b8b42a308f0dea15e75e138c72c3fd0670cba71e23e: Status 404 returned error can't find the container with id 76301ff13c2e5a35dd505b8b42a308f0dea15e75e138c72c3fd0670cba71e23e Feb 27 00:38:29 crc kubenswrapper[4781]: I0227 00:38:29.825376 4781 scope.go:117] "RemoveContainer" containerID="7d9a07674537261cb97d86282370b22b357712af922b31aea2a8cfe67e8a0a4c" Feb 27 00:38:29 crc kubenswrapper[4781]: I0227 00:38:29.872505 4781 scope.go:117] "RemoveContainer" containerID="90d3da646bb32391ad6c504fecd5db68f89221b28accf451c40b52dc228b7d89" Feb 27 00:38:29 crc kubenswrapper[4781]: I0227 00:38:29.921894 4781 scope.go:117] "RemoveContainer" containerID="89638f7647330ea3c5230d3d253e70beeda178adf35863cd73f9bfed5a1f6c4c" Feb 27 00:38:29 crc kubenswrapper[4781]: I0227 00:38:29.947068 4781 scope.go:117] "RemoveContainer" containerID="3dc1eb7dbdd6694e7292463c3972ed88e476b4fd179d083eaeff0cf57f961958" Feb 27 00:38:30 crc kubenswrapper[4781]: I0227 00:38:30.310012 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:38:30 crc kubenswrapper[4781]: E0227 00:38:30.310481 4781 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:38:30 crc kubenswrapper[4781]: I0227 00:38:30.672235 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj" event={"ID":"29e8157f-b610-48f3-93ac-9173fa6d484a","Type":"ContainerStarted","Data":"707f4b210ebda3b76fb1a923983ddc8d3406d8cc5d5249610b9d6a6d1ce7e10b"} Feb 27 00:38:30 crc kubenswrapper[4781]: I0227 00:38:30.672314 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj" event={"ID":"29e8157f-b610-48f3-93ac-9173fa6d484a","Type":"ContainerStarted","Data":"76301ff13c2e5a35dd505b8b42a308f0dea15e75e138c72c3fd0670cba71e23e"} Feb 27 00:38:30 crc kubenswrapper[4781]: I0227 00:38:30.691796 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj" podStartSLOduration=2.261710401 podStartE2EDuration="2.691779311s" podCreationTimestamp="2026-02-27 00:38:28 +0000 UTC" firstStartedPulling="2026-02-27 00:38:29.737397951 +0000 UTC m=+1978.994937495" lastFinishedPulling="2026-02-27 00:38:30.167466851 +0000 UTC m=+1979.425006405" observedRunningTime="2026-02-27 00:38:30.685501947 +0000 UTC m=+1979.943041501" watchObservedRunningTime="2026-02-27 00:38:30.691779311 +0000 UTC m=+1979.949318865" Feb 27 00:38:34 crc kubenswrapper[4781]: I0227 00:38:34.041358 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-kcmlj"] Feb 27 00:38:34 crc kubenswrapper[4781]: I0227 00:38:34.052506 4781 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/nova-api-db-create-kcmlj"] Feb 27 00:38:35 crc kubenswrapper[4781]: I0227 00:38:35.041033 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-lgv6f"] Feb 27 00:38:35 crc kubenswrapper[4781]: I0227 00:38:35.057128 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-lgv6f"] Feb 27 00:38:35 crc kubenswrapper[4781]: I0227 00:38:35.074143 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-141e-account-create-update-msmcr"] Feb 27 00:38:35 crc kubenswrapper[4781]: I0227 00:38:35.083136 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-9245-account-create-update-j6hsh"] Feb 27 00:38:35 crc kubenswrapper[4781]: I0227 00:38:35.091554 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cd3e-account-create-update-dkxt7"] Feb 27 00:38:35 crc kubenswrapper[4781]: I0227 00:38:35.103101 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cd3e-account-create-update-dkxt7"] Feb 27 00:38:35 crc kubenswrapper[4781]: I0227 00:38:35.114438 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-qx8nd"] Feb 27 00:38:35 crc kubenswrapper[4781]: I0227 00:38:35.124363 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-9245-account-create-update-j6hsh"] Feb 27 00:38:35 crc kubenswrapper[4781]: I0227 00:38:35.132425 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-141e-account-create-update-msmcr"] Feb 27 00:38:35 crc kubenswrapper[4781]: I0227 00:38:35.140841 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-qx8nd"] Feb 27 00:38:35 crc kubenswrapper[4781]: I0227 00:38:35.356864 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b4dbafa-fefb-4947-8d71-f7b0057a2ba0" 
path="/var/lib/kubelet/pods/2b4dbafa-fefb-4947-8d71-f7b0057a2ba0/volumes" Feb 27 00:38:35 crc kubenswrapper[4781]: I0227 00:38:35.364026 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce" path="/var/lib/kubelet/pods/5b9af6a0-49e8-462c-80d6-df8a3d3bd4ce/volumes" Feb 27 00:38:35 crc kubenswrapper[4781]: I0227 00:38:35.365099 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6795d880-5f00-4be4-9c67-6f8a251550cb" path="/var/lib/kubelet/pods/6795d880-5f00-4be4-9c67-6f8a251550cb/volumes" Feb 27 00:38:35 crc kubenswrapper[4781]: I0227 00:38:35.365836 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7468389a-cc9b-404c-9414-4d81f3b1a7e5" path="/var/lib/kubelet/pods/7468389a-cc9b-404c-9414-4d81f3b1a7e5/volumes" Feb 27 00:38:35 crc kubenswrapper[4781]: I0227 00:38:35.366468 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f0e335a-e4a1-48ee-b470-a6277acc5dae" path="/var/lib/kubelet/pods/7f0e335a-e4a1-48ee-b470-a6277acc5dae/volumes" Feb 27 00:38:35 crc kubenswrapper[4781]: I0227 00:38:35.384660 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2f8e017-da89-4ce0-a5b7-2339b2cf18a5" path="/var/lib/kubelet/pods/c2f8e017-da89-4ce0-a5b7-2339b2cf18a5/volumes" Feb 27 00:38:45 crc kubenswrapper[4781]: I0227 00:38:45.310174 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:38:45 crc kubenswrapper[4781]: E0227 00:38:45.310913 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 
27 00:38:57 crc kubenswrapper[4781]: I0227 00:38:57.310444 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:38:57 crc kubenswrapper[4781]: E0227 00:38:57.311801 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:39:01 crc kubenswrapper[4781]: I0227 00:39:01.942503 4781 generic.go:334] "Generic (PLEG): container finished" podID="29e8157f-b610-48f3-93ac-9173fa6d484a" containerID="707f4b210ebda3b76fb1a923983ddc8d3406d8cc5d5249610b9d6a6d1ce7e10b" exitCode=0 Feb 27 00:39:01 crc kubenswrapper[4781]: I0227 00:39:01.942596 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj" event={"ID":"29e8157f-b610-48f3-93ac-9173fa6d484a","Type":"ContainerDied","Data":"707f4b210ebda3b76fb1a923983ddc8d3406d8cc5d5249610b9d6a6d1ce7e10b"} Feb 27 00:39:03 crc kubenswrapper[4781]: I0227 00:39:03.043503 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9cntr"] Feb 27 00:39:03 crc kubenswrapper[4781]: I0227 00:39:03.058657 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9cntr"] Feb 27 00:39:03 crc kubenswrapper[4781]: I0227 00:39:03.327041 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d71a5c1e-7953-4acf-813a-0d96d4992d1f" path="/var/lib/kubelet/pods/d71a5c1e-7953-4acf-813a-0d96d4992d1f/volumes" Feb 27 00:39:03 crc kubenswrapper[4781]: I0227 00:39:03.500057 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj" Feb 27 00:39:03 crc kubenswrapper[4781]: I0227 00:39:03.643922 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh2pn\" (UniqueName: \"kubernetes.io/projected/29e8157f-b610-48f3-93ac-9173fa6d484a-kube-api-access-gh2pn\") pod \"29e8157f-b610-48f3-93ac-9173fa6d484a\" (UID: \"29e8157f-b610-48f3-93ac-9173fa6d484a\") " Feb 27 00:39:03 crc kubenswrapper[4781]: I0227 00:39:03.644054 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29e8157f-b610-48f3-93ac-9173fa6d484a-inventory\") pod \"29e8157f-b610-48f3-93ac-9173fa6d484a\" (UID: \"29e8157f-b610-48f3-93ac-9173fa6d484a\") " Feb 27 00:39:03 crc kubenswrapper[4781]: I0227 00:39:03.644090 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29e8157f-b610-48f3-93ac-9173fa6d484a-ssh-key-openstack-edpm-ipam\") pod \"29e8157f-b610-48f3-93ac-9173fa6d484a\" (UID: \"29e8157f-b610-48f3-93ac-9173fa6d484a\") " Feb 27 00:39:03 crc kubenswrapper[4781]: I0227 00:39:03.650858 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29e8157f-b610-48f3-93ac-9173fa6d484a-kube-api-access-gh2pn" (OuterVolumeSpecName: "kube-api-access-gh2pn") pod "29e8157f-b610-48f3-93ac-9173fa6d484a" (UID: "29e8157f-b610-48f3-93ac-9173fa6d484a"). InnerVolumeSpecName "kube-api-access-gh2pn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:39:03 crc kubenswrapper[4781]: I0227 00:39:03.687790 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e8157f-b610-48f3-93ac-9173fa6d484a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "29e8157f-b610-48f3-93ac-9173fa6d484a" (UID: "29e8157f-b610-48f3-93ac-9173fa6d484a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:39:03 crc kubenswrapper[4781]: I0227 00:39:03.692278 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e8157f-b610-48f3-93ac-9173fa6d484a-inventory" (OuterVolumeSpecName: "inventory") pod "29e8157f-b610-48f3-93ac-9173fa6d484a" (UID: "29e8157f-b610-48f3-93ac-9173fa6d484a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:39:03 crc kubenswrapper[4781]: I0227 00:39:03.746620 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gh2pn\" (UniqueName: \"kubernetes.io/projected/29e8157f-b610-48f3-93ac-9173fa6d484a-kube-api-access-gh2pn\") on node \"crc\" DevicePath \"\"" Feb 27 00:39:03 crc kubenswrapper[4781]: I0227 00:39:03.746671 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29e8157f-b610-48f3-93ac-9173fa6d484a-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 00:39:03 crc kubenswrapper[4781]: I0227 00:39:03.746685 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29e8157f-b610-48f3-93ac-9173fa6d484a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 00:39:03 crc kubenswrapper[4781]: I0227 00:39:03.964283 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj" 
event={"ID":"29e8157f-b610-48f3-93ac-9173fa6d484a","Type":"ContainerDied","Data":"76301ff13c2e5a35dd505b8b42a308f0dea15e75e138c72c3fd0670cba71e23e"} Feb 27 00:39:03 crc kubenswrapper[4781]: I0227 00:39:03.964699 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76301ff13c2e5a35dd505b8b42a308f0dea15e75e138c72c3fd0670cba71e23e" Feb 27 00:39:03 crc kubenswrapper[4781]: I0227 00:39:03.964763 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rrxrj" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.094600 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5"] Feb 27 00:39:04 crc kubenswrapper[4781]: E0227 00:39:04.095102 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29e8157f-b610-48f3-93ac-9173fa6d484a" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.095116 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="29e8157f-b610-48f3-93ac-9173fa6d484a" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.095330 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="29e8157f-b610-48f3-93ac-9173fa6d484a" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.096170 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.101183 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.101278 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.101928 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mvxs7" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.102973 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.115552 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5"] Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.153793 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npqcw\" (UniqueName: \"kubernetes.io/projected/b05a1d9c-7887-4173-99fe-97f7c89cc555-kube-api-access-npqcw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qtql5\" (UID: \"b05a1d9c-7887-4173-99fe-97f7c89cc555\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.153860 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b05a1d9c-7887-4173-99fe-97f7c89cc555-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qtql5\" (UID: \"b05a1d9c-7887-4173-99fe-97f7c89cc555\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.154096 4781 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b05a1d9c-7887-4173-99fe-97f7c89cc555-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qtql5\" (UID: \"b05a1d9c-7887-4173-99fe-97f7c89cc555\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.256063 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b05a1d9c-7887-4173-99fe-97f7c89cc555-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qtql5\" (UID: \"b05a1d9c-7887-4173-99fe-97f7c89cc555\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.256186 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npqcw\" (UniqueName: \"kubernetes.io/projected/b05a1d9c-7887-4173-99fe-97f7c89cc555-kube-api-access-npqcw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qtql5\" (UID: \"b05a1d9c-7887-4173-99fe-97f7c89cc555\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.256218 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b05a1d9c-7887-4173-99fe-97f7c89cc555-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qtql5\" (UID: \"b05a1d9c-7887-4173-99fe-97f7c89cc555\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.268549 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b05a1d9c-7887-4173-99fe-97f7c89cc555-inventory\") 
pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qtql5\" (UID: \"b05a1d9c-7887-4173-99fe-97f7c89cc555\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.269134 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b05a1d9c-7887-4173-99fe-97f7c89cc555-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qtql5\" (UID: \"b05a1d9c-7887-4173-99fe-97f7c89cc555\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.273875 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npqcw\" (UniqueName: \"kubernetes.io/projected/b05a1d9c-7887-4173-99fe-97f7c89cc555-kube-api-access-npqcw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qtql5\" (UID: \"b05a1d9c-7887-4173-99fe-97f7c89cc555\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.415599 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5" Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.958327 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5"] Feb 27 00:39:04 crc kubenswrapper[4781]: W0227 00:39:04.960734 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb05a1d9c_7887_4173_99fe_97f7c89cc555.slice/crio-22e6fedd890698c3f2870ebe4935123b347d59707f469cb36c3175709bf38141 WatchSource:0}: Error finding container 22e6fedd890698c3f2870ebe4935123b347d59707f469cb36c3175709bf38141: Status 404 returned error can't find the container with id 22e6fedd890698c3f2870ebe4935123b347d59707f469cb36c3175709bf38141 Feb 27 00:39:04 crc kubenswrapper[4781]: I0227 00:39:04.980286 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5" event={"ID":"b05a1d9c-7887-4173-99fe-97f7c89cc555","Type":"ContainerStarted","Data":"22e6fedd890698c3f2870ebe4935123b347d59707f469cb36c3175709bf38141"} Feb 27 00:39:05 crc kubenswrapper[4781]: I0227 00:39:05.989960 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5" event={"ID":"b05a1d9c-7887-4173-99fe-97f7c89cc555","Type":"ContainerStarted","Data":"fdbd60c17b361428ab3bb4e0269dbd498da5588801dd4a7ab30556bebd16a455"} Feb 27 00:39:06 crc kubenswrapper[4781]: I0227 00:39:06.011876 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5" podStartSLOduration=1.573851784 podStartE2EDuration="2.011857493s" podCreationTimestamp="2026-02-27 00:39:04 +0000 UTC" firstStartedPulling="2026-02-27 00:39:04.964868684 +0000 UTC m=+2014.222408228" lastFinishedPulling="2026-02-27 00:39:05.402874383 +0000 UTC m=+2014.660413937" 
observedRunningTime="2026-02-27 00:39:06.004565672 +0000 UTC m=+2015.262105256" watchObservedRunningTime="2026-02-27 00:39:06.011857493 +0000 UTC m=+2015.269397057" Feb 27 00:39:12 crc kubenswrapper[4781]: I0227 00:39:12.310077 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:39:12 crc kubenswrapper[4781]: E0227 00:39:12.310902 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:39:22 crc kubenswrapper[4781]: I0227 00:39:22.046282 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-qjkwv"] Feb 27 00:39:22 crc kubenswrapper[4781]: I0227 00:39:22.055873 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-qjkwv"] Feb 27 00:39:23 crc kubenswrapper[4781]: I0227 00:39:23.329732 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd521dc6-4126-4c51-8634-66db8ba1412e" path="/var/lib/kubelet/pods/cd521dc6-4126-4c51-8634-66db8ba1412e/volumes" Feb 27 00:39:26 crc kubenswrapper[4781]: I0227 00:39:26.309960 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:39:26 crc kubenswrapper[4781]: E0227 00:39:26.310227 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:39:29 crc kubenswrapper[4781]: I0227 00:39:29.045583 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tg9k8"] Feb 27 00:39:29 crc kubenswrapper[4781]: I0227 00:39:29.059948 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tg9k8"] Feb 27 00:39:29 crc kubenswrapper[4781]: I0227 00:39:29.321496 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b607db2c-2aa3-48f0-9cd8-c5461797431c" path="/var/lib/kubelet/pods/b607db2c-2aa3-48f0-9cd8-c5461797431c/volumes" Feb 27 00:39:30 crc kubenswrapper[4781]: I0227 00:39:30.093838 4781 scope.go:117] "RemoveContainer" containerID="39276ac01bb5ee770105ba2bf75f8d61d8081e22c89cdaa97c9f7ed7f2722110" Feb 27 00:39:30 crc kubenswrapper[4781]: I0227 00:39:30.168854 4781 scope.go:117] "RemoveContainer" containerID="a4bad047d90bd3b11bea212cddee0782007013387656451beeca5b44aee50150" Feb 27 00:39:30 crc kubenswrapper[4781]: I0227 00:39:30.236552 4781 scope.go:117] "RemoveContainer" containerID="12e5844f351b3d039dc82ba98df27afa29e4eaea9f5b2ec45b3c8cb5d018e0ca" Feb 27 00:39:30 crc kubenswrapper[4781]: I0227 00:39:30.261477 4781 scope.go:117] "RemoveContainer" containerID="ae3d06d551b95e82732253f74b171a292fd2201889c2e3a5a620c3b16fb394dd" Feb 27 00:39:30 crc kubenswrapper[4781]: I0227 00:39:30.312105 4781 scope.go:117] "RemoveContainer" containerID="c9388f02af5b31dc8f5e8ea62ee66fb19cbab695e94e5d03ed46c036e292ce69" Feb 27 00:39:30 crc kubenswrapper[4781]: I0227 00:39:30.374718 4781 scope.go:117] "RemoveContainer" containerID="a7acf67e842e66e4a577e00cfd7561f83ca973cea54d959ed8fb7c9427da2a89" Feb 27 00:39:30 crc kubenswrapper[4781]: I0227 00:39:30.404017 4781 scope.go:117] "RemoveContainer" containerID="feae0a2cae038402fdacbd138e93b4a28e83ea37dfdf069227fa89f2c8eea228" Feb 27 00:39:30 crc kubenswrapper[4781]: 
I0227 00:39:30.423173 4781 scope.go:117] "RemoveContainer" containerID="24536e1e89dfec02307e517e9566052e3516ec64369f8d65d2939b8e4650f889" Feb 27 00:39:30 crc kubenswrapper[4781]: I0227 00:39:30.443132 4781 scope.go:117] "RemoveContainer" containerID="e064657ef0c106a3592f283bb81ae42d2444dda1caced8f721f45cdcfe863108" Feb 27 00:39:37 crc kubenswrapper[4781]: I0227 00:39:37.309980 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:39:37 crc kubenswrapper[4781]: E0227 00:39:37.310701 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:39:49 crc kubenswrapper[4781]: I0227 00:39:49.463831 4781 generic.go:334] "Generic (PLEG): container finished" podID="b05a1d9c-7887-4173-99fe-97f7c89cc555" containerID="fdbd60c17b361428ab3bb4e0269dbd498da5588801dd4a7ab30556bebd16a455" exitCode=0 Feb 27 00:39:49 crc kubenswrapper[4781]: I0227 00:39:49.464058 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5" event={"ID":"b05a1d9c-7887-4173-99fe-97f7c89cc555","Type":"ContainerDied","Data":"fdbd60c17b361428ab3bb4e0269dbd498da5588801dd4a7ab30556bebd16a455"} Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.005884 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.104145 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npqcw\" (UniqueName: \"kubernetes.io/projected/b05a1d9c-7887-4173-99fe-97f7c89cc555-kube-api-access-npqcw\") pod \"b05a1d9c-7887-4173-99fe-97f7c89cc555\" (UID: \"b05a1d9c-7887-4173-99fe-97f7c89cc555\") " Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.104197 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b05a1d9c-7887-4173-99fe-97f7c89cc555-ssh-key-openstack-edpm-ipam\") pod \"b05a1d9c-7887-4173-99fe-97f7c89cc555\" (UID: \"b05a1d9c-7887-4173-99fe-97f7c89cc555\") " Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.104314 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b05a1d9c-7887-4173-99fe-97f7c89cc555-inventory\") pod \"b05a1d9c-7887-4173-99fe-97f7c89cc555\" (UID: \"b05a1d9c-7887-4173-99fe-97f7c89cc555\") " Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.110868 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b05a1d9c-7887-4173-99fe-97f7c89cc555-kube-api-access-npqcw" (OuterVolumeSpecName: "kube-api-access-npqcw") pod "b05a1d9c-7887-4173-99fe-97f7c89cc555" (UID: "b05a1d9c-7887-4173-99fe-97f7c89cc555"). InnerVolumeSpecName "kube-api-access-npqcw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.135346 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b05a1d9c-7887-4173-99fe-97f7c89cc555-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b05a1d9c-7887-4173-99fe-97f7c89cc555" (UID: "b05a1d9c-7887-4173-99fe-97f7c89cc555"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.147802 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b05a1d9c-7887-4173-99fe-97f7c89cc555-inventory" (OuterVolumeSpecName: "inventory") pod "b05a1d9c-7887-4173-99fe-97f7c89cc555" (UID: "b05a1d9c-7887-4173-99fe-97f7c89cc555"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.206223 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npqcw\" (UniqueName: \"kubernetes.io/projected/b05a1d9c-7887-4173-99fe-97f7c89cc555-kube-api-access-npqcw\") on node \"crc\" DevicePath \"\"" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.206260 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b05a1d9c-7887-4173-99fe-97f7c89cc555-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.206270 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b05a1d9c-7887-4173-99fe-97f7c89cc555-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.481868 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5" 
event={"ID":"b05a1d9c-7887-4173-99fe-97f7c89cc555","Type":"ContainerDied","Data":"22e6fedd890698c3f2870ebe4935123b347d59707f469cb36c3175709bf38141"} Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.481910 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22e6fedd890698c3f2870ebe4935123b347d59707f469cb36c3175709bf38141" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.481963 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qtql5" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.584557 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-vvrmt"] Feb 27 00:39:51 crc kubenswrapper[4781]: E0227 00:39:51.584982 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b05a1d9c-7887-4173-99fe-97f7c89cc555" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.585031 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b05a1d9c-7887-4173-99fe-97f7c89cc555" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.585247 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="b05a1d9c-7887-4173-99fe-97f7c89cc555" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.586045 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-vvrmt" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.589269 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mvxs7" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.590158 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.590187 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.590504 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.594990 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-vvrmt"] Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.723102 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcbjq\" (UniqueName: \"kubernetes.io/projected/35b9cf19-a1cd-48b5-9072-d5c71680c892-kube-api-access-lcbjq\") pod \"ssh-known-hosts-edpm-deployment-vvrmt\" (UID: \"35b9cf19-a1cd-48b5-9072-d5c71680c892\") " pod="openstack/ssh-known-hosts-edpm-deployment-vvrmt" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.723520 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/35b9cf19-a1cd-48b5-9072-d5c71680c892-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-vvrmt\" (UID: \"35b9cf19-a1cd-48b5-9072-d5c71680c892\") " pod="openstack/ssh-known-hosts-edpm-deployment-vvrmt" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.723572 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/35b9cf19-a1cd-48b5-9072-d5c71680c892-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-vvrmt\" (UID: \"35b9cf19-a1cd-48b5-9072-d5c71680c892\") " pod="openstack/ssh-known-hosts-edpm-deployment-vvrmt" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.826177 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/35b9cf19-a1cd-48b5-9072-d5c71680c892-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-vvrmt\" (UID: \"35b9cf19-a1cd-48b5-9072-d5c71680c892\") " pod="openstack/ssh-known-hosts-edpm-deployment-vvrmt" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.826256 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/35b9cf19-a1cd-48b5-9072-d5c71680c892-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-vvrmt\" (UID: \"35b9cf19-a1cd-48b5-9072-d5c71680c892\") " pod="openstack/ssh-known-hosts-edpm-deployment-vvrmt" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.826402 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcbjq\" (UniqueName: \"kubernetes.io/projected/35b9cf19-a1cd-48b5-9072-d5c71680c892-kube-api-access-lcbjq\") pod \"ssh-known-hosts-edpm-deployment-vvrmt\" (UID: \"35b9cf19-a1cd-48b5-9072-d5c71680c892\") " pod="openstack/ssh-known-hosts-edpm-deployment-vvrmt" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.830059 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/35b9cf19-a1cd-48b5-9072-d5c71680c892-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-vvrmt\" (UID: \"35b9cf19-a1cd-48b5-9072-d5c71680c892\") " pod="openstack/ssh-known-hosts-edpm-deployment-vvrmt" Feb 27 00:39:51 crc kubenswrapper[4781]: 
I0227 00:39:51.832295 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/35b9cf19-a1cd-48b5-9072-d5c71680c892-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-vvrmt\" (UID: \"35b9cf19-a1cd-48b5-9072-d5c71680c892\") " pod="openstack/ssh-known-hosts-edpm-deployment-vvrmt" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.843533 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcbjq\" (UniqueName: \"kubernetes.io/projected/35b9cf19-a1cd-48b5-9072-d5c71680c892-kube-api-access-lcbjq\") pod \"ssh-known-hosts-edpm-deployment-vvrmt\" (UID: \"35b9cf19-a1cd-48b5-9072-d5c71680c892\") " pod="openstack/ssh-known-hosts-edpm-deployment-vvrmt" Feb 27 00:39:51 crc kubenswrapper[4781]: I0227 00:39:51.909953 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-vvrmt" Feb 27 00:39:52 crc kubenswrapper[4781]: I0227 00:39:52.309334 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:39:52 crc kubenswrapper[4781]: I0227 00:39:52.425072 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-vvrmt"] Feb 27 00:39:52 crc kubenswrapper[4781]: I0227 00:39:52.491867 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-vvrmt" event={"ID":"35b9cf19-a1cd-48b5-9072-d5c71680c892","Type":"ContainerStarted","Data":"ca371ab0a523aa64c50591b46a3e97f3b89b4de31e340ed13a5023bcc93c87de"} Feb 27 00:39:53 crc kubenswrapper[4781]: I0227 00:39:53.508498 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"4da315c4c7bf218d380bca00c0ade3ee72457fd61b27366edc67ffcf85618e37"} Feb 27 00:39:54 crc 
kubenswrapper[4781]: I0227 00:39:54.519204 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-vvrmt" event={"ID":"35b9cf19-a1cd-48b5-9072-d5c71680c892","Type":"ContainerStarted","Data":"b5b184577d5049b034be6a6f63b1b866cbf4d799620d3da5c03b7145ebd8f076"} Feb 27 00:39:54 crc kubenswrapper[4781]: I0227 00:39:54.539126 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-vvrmt" podStartSLOduration=2.910643313 podStartE2EDuration="3.539101703s" podCreationTimestamp="2026-02-27 00:39:51 +0000 UTC" firstStartedPulling="2026-02-27 00:39:52.442930739 +0000 UTC m=+2061.700470293" lastFinishedPulling="2026-02-27 00:39:53.071389129 +0000 UTC m=+2062.328928683" observedRunningTime="2026-02-27 00:39:54.533605519 +0000 UTC m=+2063.791145073" watchObservedRunningTime="2026-02-27 00:39:54.539101703 +0000 UTC m=+2063.796641247" Feb 27 00:39:59 crc kubenswrapper[4781]: I0227 00:39:59.566404 4781 generic.go:334] "Generic (PLEG): container finished" podID="35b9cf19-a1cd-48b5-9072-d5c71680c892" containerID="b5b184577d5049b034be6a6f63b1b866cbf4d799620d3da5c03b7145ebd8f076" exitCode=0 Feb 27 00:39:59 crc kubenswrapper[4781]: I0227 00:39:59.566497 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-vvrmt" event={"ID":"35b9cf19-a1cd-48b5-9072-d5c71680c892","Type":"ContainerDied","Data":"b5b184577d5049b034be6a6f63b1b866cbf4d799620d3da5c03b7145ebd8f076"} Feb 27 00:40:00 crc kubenswrapper[4781]: I0227 00:40:00.140083 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535880-9cpwk"] Feb 27 00:40:00 crc kubenswrapper[4781]: I0227 00:40:00.142287 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535880-9cpwk" Feb 27 00:40:00 crc kubenswrapper[4781]: I0227 00:40:00.144906 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:40:00 crc kubenswrapper[4781]: I0227 00:40:00.145333 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:40:00 crc kubenswrapper[4781]: I0227 00:40:00.145420 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:40:00 crc kubenswrapper[4781]: I0227 00:40:00.215254 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535880-9cpwk"] Feb 27 00:40:00 crc kubenswrapper[4781]: I0227 00:40:00.217488 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc9gh\" (UniqueName: \"kubernetes.io/projected/93fc175b-7238-41ec-91f7-17cc07188100-kube-api-access-bc9gh\") pod \"auto-csr-approver-29535880-9cpwk\" (UID: \"93fc175b-7238-41ec-91f7-17cc07188100\") " pod="openshift-infra/auto-csr-approver-29535880-9cpwk" Feb 27 00:40:00 crc kubenswrapper[4781]: I0227 00:40:00.319068 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc9gh\" (UniqueName: \"kubernetes.io/projected/93fc175b-7238-41ec-91f7-17cc07188100-kube-api-access-bc9gh\") pod \"auto-csr-approver-29535880-9cpwk\" (UID: \"93fc175b-7238-41ec-91f7-17cc07188100\") " pod="openshift-infra/auto-csr-approver-29535880-9cpwk" Feb 27 00:40:00 crc kubenswrapper[4781]: I0227 00:40:00.337368 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc9gh\" (UniqueName: \"kubernetes.io/projected/93fc175b-7238-41ec-91f7-17cc07188100-kube-api-access-bc9gh\") pod \"auto-csr-approver-29535880-9cpwk\" (UID: \"93fc175b-7238-41ec-91f7-17cc07188100\") " 
pod="openshift-infra/auto-csr-approver-29535880-9cpwk" Feb 27 00:40:00 crc kubenswrapper[4781]: I0227 00:40:00.530697 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535880-9cpwk" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.029220 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535880-9cpwk"] Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.212419 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-vvrmt" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.347683 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/35b9cf19-a1cd-48b5-9072-d5c71680c892-inventory-0\") pod \"35b9cf19-a1cd-48b5-9072-d5c71680c892\" (UID: \"35b9cf19-a1cd-48b5-9072-d5c71680c892\") " Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.347824 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcbjq\" (UniqueName: \"kubernetes.io/projected/35b9cf19-a1cd-48b5-9072-d5c71680c892-kube-api-access-lcbjq\") pod \"35b9cf19-a1cd-48b5-9072-d5c71680c892\" (UID: \"35b9cf19-a1cd-48b5-9072-d5c71680c892\") " Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.347915 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/35b9cf19-a1cd-48b5-9072-d5c71680c892-ssh-key-openstack-edpm-ipam\") pod \"35b9cf19-a1cd-48b5-9072-d5c71680c892\" (UID: \"35b9cf19-a1cd-48b5-9072-d5c71680c892\") " Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.352932 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35b9cf19-a1cd-48b5-9072-d5c71680c892-kube-api-access-lcbjq" (OuterVolumeSpecName: "kube-api-access-lcbjq") pod 
"35b9cf19-a1cd-48b5-9072-d5c71680c892" (UID: "35b9cf19-a1cd-48b5-9072-d5c71680c892"). InnerVolumeSpecName "kube-api-access-lcbjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.375394 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35b9cf19-a1cd-48b5-9072-d5c71680c892-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "35b9cf19-a1cd-48b5-9072-d5c71680c892" (UID: "35b9cf19-a1cd-48b5-9072-d5c71680c892"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.376457 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35b9cf19-a1cd-48b5-9072-d5c71680c892-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "35b9cf19-a1cd-48b5-9072-d5c71680c892" (UID: "35b9cf19-a1cd-48b5-9072-d5c71680c892"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.451003 4781 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/35b9cf19-a1cd-48b5-9072-d5c71680c892-inventory-0\") on node \"crc\" DevicePath \"\"" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.451041 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcbjq\" (UniqueName: \"kubernetes.io/projected/35b9cf19-a1cd-48b5-9072-d5c71680c892-kube-api-access-lcbjq\") on node \"crc\" DevicePath \"\"" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.451057 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/35b9cf19-a1cd-48b5-9072-d5c71680c892-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.610987 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-vvrmt" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.610977 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-vvrmt" event={"ID":"35b9cf19-a1cd-48b5-9072-d5c71680c892","Type":"ContainerDied","Data":"ca371ab0a523aa64c50591b46a3e97f3b89b4de31e340ed13a5023bcc93c87de"} Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.611684 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca371ab0a523aa64c50591b46a3e97f3b89b4de31e340ed13a5023bcc93c87de" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.614669 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535880-9cpwk" event={"ID":"93fc175b-7238-41ec-91f7-17cc07188100","Type":"ContainerStarted","Data":"d64226a21c9c3afbfc96f1c1e82063d6bba2c61ceba988ed9bedd7298eca0e90"} Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.658017 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts"] Feb 27 00:40:01 crc kubenswrapper[4781]: E0227 00:40:01.658427 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35b9cf19-a1cd-48b5-9072-d5c71680c892" containerName="ssh-known-hosts-edpm-deployment" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.658445 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="35b9cf19-a1cd-48b5-9072-d5c71680c892" containerName="ssh-known-hosts-edpm-deployment" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.658685 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="35b9cf19-a1cd-48b5-9072-d5c71680c892" containerName="ssh-known-hosts-edpm-deployment" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.661355 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.668992 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts"] Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.717771 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.718004 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.718077 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.718505 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mvxs7" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.756838 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8qqb\" (UniqueName: \"kubernetes.io/projected/2a7f1888-0c26-47e0-91b4-fbf07824cab4-kube-api-access-t8qqb\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-n6nts\" (UID: \"2a7f1888-0c26-47e0-91b4-fbf07824cab4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.756900 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a7f1888-0c26-47e0-91b4-fbf07824cab4-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-n6nts\" (UID: \"2a7f1888-0c26-47e0-91b4-fbf07824cab4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.757031 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a7f1888-0c26-47e0-91b4-fbf07824cab4-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-n6nts\" (UID: \"2a7f1888-0c26-47e0-91b4-fbf07824cab4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.859336 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8qqb\" (UniqueName: \"kubernetes.io/projected/2a7f1888-0c26-47e0-91b4-fbf07824cab4-kube-api-access-t8qqb\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-n6nts\" (UID: \"2a7f1888-0c26-47e0-91b4-fbf07824cab4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.859417 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a7f1888-0c26-47e0-91b4-fbf07824cab4-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-n6nts\" (UID: \"2a7f1888-0c26-47e0-91b4-fbf07824cab4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.859489 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a7f1888-0c26-47e0-91b4-fbf07824cab4-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-n6nts\" (UID: \"2a7f1888-0c26-47e0-91b4-fbf07824cab4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.864502 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a7f1888-0c26-47e0-91b4-fbf07824cab4-ssh-key-openstack-edpm-ipam\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-n6nts\" (UID: \"2a7f1888-0c26-47e0-91b4-fbf07824cab4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.867588 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a7f1888-0c26-47e0-91b4-fbf07824cab4-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-n6nts\" (UID: \"2a7f1888-0c26-47e0-91b4-fbf07824cab4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts" Feb 27 00:40:01 crc kubenswrapper[4781]: I0227 00:40:01.875524 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8qqb\" (UniqueName: \"kubernetes.io/projected/2a7f1888-0c26-47e0-91b4-fbf07824cab4-kube-api-access-t8qqb\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-n6nts\" (UID: \"2a7f1888-0c26-47e0-91b4-fbf07824cab4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts" Feb 27 00:40:02 crc kubenswrapper[4781]: I0227 00:40:02.049572 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts" Feb 27 00:40:02 crc kubenswrapper[4781]: I0227 00:40:02.608414 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts"] Feb 27 00:40:02 crc kubenswrapper[4781]: I0227 00:40:02.630357 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535880-9cpwk" event={"ID":"93fc175b-7238-41ec-91f7-17cc07188100","Type":"ContainerStarted","Data":"f95b25c7f6b69f37212289ff6ccaf1c8b693e043eb0635c23ef340ef5632fb12"} Feb 27 00:40:02 crc kubenswrapper[4781]: I0227 00:40:02.632875 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts" event={"ID":"2a7f1888-0c26-47e0-91b4-fbf07824cab4","Type":"ContainerStarted","Data":"6121ec95ad5ae95181b3e1d0c2b155e2501f4100e8c01f3026d3c448fdecde2c"} Feb 27 00:40:02 crc kubenswrapper[4781]: I0227 00:40:02.648528 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535880-9cpwk" podStartSLOduration=1.430176149 podStartE2EDuration="2.648511212s" podCreationTimestamp="2026-02-27 00:40:00 +0000 UTC" firstStartedPulling="2026-02-27 00:40:01.041178112 +0000 UTC m=+2070.298717666" lastFinishedPulling="2026-02-27 00:40:02.259513175 +0000 UTC m=+2071.517052729" observedRunningTime="2026-02-27 00:40:02.647939857 +0000 UTC m=+2071.905479421" watchObservedRunningTime="2026-02-27 00:40:02.648511212 +0000 UTC m=+2071.906050766" Feb 27 00:40:03 crc kubenswrapper[4781]: I0227 00:40:03.652549 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts" event={"ID":"2a7f1888-0c26-47e0-91b4-fbf07824cab4","Type":"ContainerStarted","Data":"681e0ec5aee424b20955c1f2f0d9d1da7fd9f3929df7cef80074d17dd5991180"} Feb 27 00:40:03 crc kubenswrapper[4781]: I0227 00:40:03.656442 4781 generic.go:334] "Generic 
(PLEG): container finished" podID="93fc175b-7238-41ec-91f7-17cc07188100" containerID="f95b25c7f6b69f37212289ff6ccaf1c8b693e043eb0635c23ef340ef5632fb12" exitCode=0 Feb 27 00:40:03 crc kubenswrapper[4781]: I0227 00:40:03.656488 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535880-9cpwk" event={"ID":"93fc175b-7238-41ec-91f7-17cc07188100","Type":"ContainerDied","Data":"f95b25c7f6b69f37212289ff6ccaf1c8b693e043eb0635c23ef340ef5632fb12"} Feb 27 00:40:03 crc kubenswrapper[4781]: I0227 00:40:03.713644 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts" podStartSLOduration=2.2799835010000002 podStartE2EDuration="2.713598364s" podCreationTimestamp="2026-02-27 00:40:01 +0000 UTC" firstStartedPulling="2026-02-27 00:40:02.590961859 +0000 UTC m=+2071.848501413" lastFinishedPulling="2026-02-27 00:40:03.024576712 +0000 UTC m=+2072.282116276" observedRunningTime="2026-02-27 00:40:03.696398454 +0000 UTC m=+2072.953938008" watchObservedRunningTime="2026-02-27 00:40:03.713598364 +0000 UTC m=+2072.971137928" Feb 27 00:40:05 crc kubenswrapper[4781]: I0227 00:40:05.207452 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535880-9cpwk" Feb 27 00:40:05 crc kubenswrapper[4781]: I0227 00:40:05.338956 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bc9gh\" (UniqueName: \"kubernetes.io/projected/93fc175b-7238-41ec-91f7-17cc07188100-kube-api-access-bc9gh\") pod \"93fc175b-7238-41ec-91f7-17cc07188100\" (UID: \"93fc175b-7238-41ec-91f7-17cc07188100\") " Feb 27 00:40:05 crc kubenswrapper[4781]: I0227 00:40:05.364536 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93fc175b-7238-41ec-91f7-17cc07188100-kube-api-access-bc9gh" (OuterVolumeSpecName: "kube-api-access-bc9gh") pod "93fc175b-7238-41ec-91f7-17cc07188100" (UID: "93fc175b-7238-41ec-91f7-17cc07188100"). InnerVolumeSpecName "kube-api-access-bc9gh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:40:05 crc kubenswrapper[4781]: I0227 00:40:05.442293 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bc9gh\" (UniqueName: \"kubernetes.io/projected/93fc175b-7238-41ec-91f7-17cc07188100-kube-api-access-bc9gh\") on node \"crc\" DevicePath \"\"" Feb 27 00:40:05 crc kubenswrapper[4781]: I0227 00:40:05.676369 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535880-9cpwk" event={"ID":"93fc175b-7238-41ec-91f7-17cc07188100","Type":"ContainerDied","Data":"d64226a21c9c3afbfc96f1c1e82063d6bba2c61ceba988ed9bedd7298eca0e90"} Feb 27 00:40:05 crc kubenswrapper[4781]: I0227 00:40:05.676419 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535880-9cpwk" Feb 27 00:40:05 crc kubenswrapper[4781]: I0227 00:40:05.676427 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d64226a21c9c3afbfc96f1c1e82063d6bba2c61ceba988ed9bedd7298eca0e90" Feb 27 00:40:05 crc kubenswrapper[4781]: I0227 00:40:05.723763 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535874-9b4fw"] Feb 27 00:40:05 crc kubenswrapper[4781]: I0227 00:40:05.732295 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535874-9b4fw"] Feb 27 00:40:07 crc kubenswrapper[4781]: I0227 00:40:07.321820 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21bdad75-a7e5-4940-9ee3-be513a55b97d" path="/var/lib/kubelet/pods/21bdad75-a7e5-4940-9ee3-be513a55b97d/volumes" Feb 27 00:40:10 crc kubenswrapper[4781]: I0227 00:40:10.030673 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-6twxl"] Feb 27 00:40:10 crc kubenswrapper[4781]: I0227 00:40:10.040458 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-6twxl"] Feb 27 00:40:10 crc kubenswrapper[4781]: I0227 00:40:10.740718 4781 generic.go:334] "Generic (PLEG): container finished" podID="2a7f1888-0c26-47e0-91b4-fbf07824cab4" containerID="681e0ec5aee424b20955c1f2f0d9d1da7fd9f3929df7cef80074d17dd5991180" exitCode=0 Feb 27 00:40:10 crc kubenswrapper[4781]: I0227 00:40:10.740807 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts" event={"ID":"2a7f1888-0c26-47e0-91b4-fbf07824cab4","Type":"ContainerDied","Data":"681e0ec5aee424b20955c1f2f0d9d1da7fd9f3929df7cef80074d17dd5991180"} Feb 27 00:40:11 crc kubenswrapper[4781]: I0227 00:40:11.325385 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27b0d2a5-5629-42a0-8884-a5534240b356" 
path="/var/lib/kubelet/pods/27b0d2a5-5629-42a0-8884-a5534240b356/volumes" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.256950 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.288225 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a7f1888-0c26-47e0-91b4-fbf07824cab4-ssh-key-openstack-edpm-ipam\") pod \"2a7f1888-0c26-47e0-91b4-fbf07824cab4\" (UID: \"2a7f1888-0c26-47e0-91b4-fbf07824cab4\") " Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.288456 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a7f1888-0c26-47e0-91b4-fbf07824cab4-inventory\") pod \"2a7f1888-0c26-47e0-91b4-fbf07824cab4\" (UID: \"2a7f1888-0c26-47e0-91b4-fbf07824cab4\") " Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.288846 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8qqb\" (UniqueName: \"kubernetes.io/projected/2a7f1888-0c26-47e0-91b4-fbf07824cab4-kube-api-access-t8qqb\") pod \"2a7f1888-0c26-47e0-91b4-fbf07824cab4\" (UID: \"2a7f1888-0c26-47e0-91b4-fbf07824cab4\") " Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.325475 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a7f1888-0c26-47e0-91b4-fbf07824cab4-kube-api-access-t8qqb" (OuterVolumeSpecName: "kube-api-access-t8qqb") pod "2a7f1888-0c26-47e0-91b4-fbf07824cab4" (UID: "2a7f1888-0c26-47e0-91b4-fbf07824cab4"). InnerVolumeSpecName "kube-api-access-t8qqb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.330728 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a7f1888-0c26-47e0-91b4-fbf07824cab4-inventory" (OuterVolumeSpecName: "inventory") pod "2a7f1888-0c26-47e0-91b4-fbf07824cab4" (UID: "2a7f1888-0c26-47e0-91b4-fbf07824cab4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.331256 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a7f1888-0c26-47e0-91b4-fbf07824cab4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2a7f1888-0c26-47e0-91b4-fbf07824cab4" (UID: "2a7f1888-0c26-47e0-91b4-fbf07824cab4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.392277 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a7f1888-0c26-47e0-91b4-fbf07824cab4-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.392311 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8qqb\" (UniqueName: \"kubernetes.io/projected/2a7f1888-0c26-47e0-91b4-fbf07824cab4-kube-api-access-t8qqb\") on node \"crc\" DevicePath \"\"" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.392328 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a7f1888-0c26-47e0-91b4-fbf07824cab4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.761554 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts" 
event={"ID":"2a7f1888-0c26-47e0-91b4-fbf07824cab4","Type":"ContainerDied","Data":"6121ec95ad5ae95181b3e1d0c2b155e2501f4100e8c01f3026d3c448fdecde2c"} Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.761597 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6121ec95ad5ae95181b3e1d0c2b155e2501f4100e8c01f3026d3c448fdecde2c" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.761619 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-n6nts" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.848121 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz"] Feb 27 00:40:12 crc kubenswrapper[4781]: E0227 00:40:12.848698 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93fc175b-7238-41ec-91f7-17cc07188100" containerName="oc" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.848720 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="93fc175b-7238-41ec-91f7-17cc07188100" containerName="oc" Feb 27 00:40:12 crc kubenswrapper[4781]: E0227 00:40:12.848760 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a7f1888-0c26-47e0-91b4-fbf07824cab4" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.848771 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a7f1888-0c26-47e0-91b4-fbf07824cab4" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.849053 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a7f1888-0c26-47e0-91b4-fbf07824cab4" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.849079 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="93fc175b-7238-41ec-91f7-17cc07188100" 
containerName="oc" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.850082 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.852310 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.852787 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.852976 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mvxs7" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.856029 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.857916 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz"] Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.903504 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98c901e2-eff5-4256-9add-25d09beb51e3-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz\" (UID: \"98c901e2-eff5-4256-9add-25d09beb51e3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.903581 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98c901e2-eff5-4256-9add-25d09beb51e3-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz\" (UID: \"98c901e2-eff5-4256-9add-25d09beb51e3\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz" Feb 27 00:40:12 crc kubenswrapper[4781]: I0227 00:40:12.903777 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kl6n\" (UniqueName: \"kubernetes.io/projected/98c901e2-eff5-4256-9add-25d09beb51e3-kube-api-access-8kl6n\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz\" (UID: \"98c901e2-eff5-4256-9add-25d09beb51e3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz" Feb 27 00:40:13 crc kubenswrapper[4781]: I0227 00:40:13.006121 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kl6n\" (UniqueName: \"kubernetes.io/projected/98c901e2-eff5-4256-9add-25d09beb51e3-kube-api-access-8kl6n\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz\" (UID: \"98c901e2-eff5-4256-9add-25d09beb51e3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz" Feb 27 00:40:13 crc kubenswrapper[4781]: I0227 00:40:13.006278 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98c901e2-eff5-4256-9add-25d09beb51e3-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz\" (UID: \"98c901e2-eff5-4256-9add-25d09beb51e3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz" Feb 27 00:40:13 crc kubenswrapper[4781]: I0227 00:40:13.006328 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98c901e2-eff5-4256-9add-25d09beb51e3-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz\" (UID: \"98c901e2-eff5-4256-9add-25d09beb51e3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz" Feb 27 00:40:13 crc kubenswrapper[4781]: I0227 00:40:13.010995 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98c901e2-eff5-4256-9add-25d09beb51e3-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz\" (UID: \"98c901e2-eff5-4256-9add-25d09beb51e3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz" Feb 27 00:40:13 crc kubenswrapper[4781]: I0227 00:40:13.011317 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98c901e2-eff5-4256-9add-25d09beb51e3-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz\" (UID: \"98c901e2-eff5-4256-9add-25d09beb51e3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz" Feb 27 00:40:13 crc kubenswrapper[4781]: I0227 00:40:13.024704 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kl6n\" (UniqueName: \"kubernetes.io/projected/98c901e2-eff5-4256-9add-25d09beb51e3-kube-api-access-8kl6n\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz\" (UID: \"98c901e2-eff5-4256-9add-25d09beb51e3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz" Feb 27 00:40:13 crc kubenswrapper[4781]: I0227 00:40:13.186760 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz" Feb 27 00:40:13 crc kubenswrapper[4781]: I0227 00:40:13.764826 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz"] Feb 27 00:40:13 crc kubenswrapper[4781]: I0227 00:40:13.773412 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz" event={"ID":"98c901e2-eff5-4256-9add-25d09beb51e3","Type":"ContainerStarted","Data":"5c51816116cb2a00768333129536aa3bda367f597a1b5a8af5d31966b94ebe8f"} Feb 27 00:40:15 crc kubenswrapper[4781]: I0227 00:40:15.792975 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz" event={"ID":"98c901e2-eff5-4256-9add-25d09beb51e3","Type":"ContainerStarted","Data":"81556b2b1512e6b2cac6ee77543475833768d132f92f55999253cefef07fe4fe"} Feb 27 00:40:15 crc kubenswrapper[4781]: I0227 00:40:15.812011 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz" podStartSLOduration=2.89051899 podStartE2EDuration="3.81199201s" podCreationTimestamp="2026-02-27 00:40:12 +0000 UTC" firstStartedPulling="2026-02-27 00:40:13.763702157 +0000 UTC m=+2083.021241711" lastFinishedPulling="2026-02-27 00:40:14.685175167 +0000 UTC m=+2083.942714731" observedRunningTime="2026-02-27 00:40:15.808183161 +0000 UTC m=+2085.065722715" watchObservedRunningTime="2026-02-27 00:40:15.81199201 +0000 UTC m=+2085.069531564" Feb 27 00:40:23 crc kubenswrapper[4781]: I0227 00:40:23.421878 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ngjmc"] Feb 27 00:40:23 crc kubenswrapper[4781]: I0227 00:40:23.424614 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ngjmc" Feb 27 00:40:23 crc kubenswrapper[4781]: I0227 00:40:23.433016 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ngjmc"] Feb 27 00:40:23 crc kubenswrapper[4781]: I0227 00:40:23.534180 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5ccb94d-a0c4-4247-85cc-76049a84eef6-utilities\") pod \"redhat-operators-ngjmc\" (UID: \"c5ccb94d-a0c4-4247-85cc-76049a84eef6\") " pod="openshift-marketplace/redhat-operators-ngjmc" Feb 27 00:40:23 crc kubenswrapper[4781]: I0227 00:40:23.534461 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5ccb94d-a0c4-4247-85cc-76049a84eef6-catalog-content\") pod \"redhat-operators-ngjmc\" (UID: \"c5ccb94d-a0c4-4247-85cc-76049a84eef6\") " pod="openshift-marketplace/redhat-operators-ngjmc" Feb 27 00:40:23 crc kubenswrapper[4781]: I0227 00:40:23.534543 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8v97\" (UniqueName: \"kubernetes.io/projected/c5ccb94d-a0c4-4247-85cc-76049a84eef6-kube-api-access-b8v97\") pod \"redhat-operators-ngjmc\" (UID: \"c5ccb94d-a0c4-4247-85cc-76049a84eef6\") " pod="openshift-marketplace/redhat-operators-ngjmc" Feb 27 00:40:23 crc kubenswrapper[4781]: I0227 00:40:23.636847 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5ccb94d-a0c4-4247-85cc-76049a84eef6-catalog-content\") pod \"redhat-operators-ngjmc\" (UID: \"c5ccb94d-a0c4-4247-85cc-76049a84eef6\") " pod="openshift-marketplace/redhat-operators-ngjmc" Feb 27 00:40:23 crc kubenswrapper[4781]: I0227 00:40:23.636904 4781 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-b8v97\" (UniqueName: \"kubernetes.io/projected/c5ccb94d-a0c4-4247-85cc-76049a84eef6-kube-api-access-b8v97\") pod \"redhat-operators-ngjmc\" (UID: \"c5ccb94d-a0c4-4247-85cc-76049a84eef6\") " pod="openshift-marketplace/redhat-operators-ngjmc" Feb 27 00:40:23 crc kubenswrapper[4781]: I0227 00:40:23.637107 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5ccb94d-a0c4-4247-85cc-76049a84eef6-utilities\") pod \"redhat-operators-ngjmc\" (UID: \"c5ccb94d-a0c4-4247-85cc-76049a84eef6\") " pod="openshift-marketplace/redhat-operators-ngjmc" Feb 27 00:40:23 crc kubenswrapper[4781]: I0227 00:40:23.637762 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5ccb94d-a0c4-4247-85cc-76049a84eef6-catalog-content\") pod \"redhat-operators-ngjmc\" (UID: \"c5ccb94d-a0c4-4247-85cc-76049a84eef6\") " pod="openshift-marketplace/redhat-operators-ngjmc" Feb 27 00:40:23 crc kubenswrapper[4781]: I0227 00:40:23.637957 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5ccb94d-a0c4-4247-85cc-76049a84eef6-utilities\") pod \"redhat-operators-ngjmc\" (UID: \"c5ccb94d-a0c4-4247-85cc-76049a84eef6\") " pod="openshift-marketplace/redhat-operators-ngjmc" Feb 27 00:40:23 crc kubenswrapper[4781]: I0227 00:40:23.661676 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8v97\" (UniqueName: \"kubernetes.io/projected/c5ccb94d-a0c4-4247-85cc-76049a84eef6-kube-api-access-b8v97\") pod \"redhat-operators-ngjmc\" (UID: \"c5ccb94d-a0c4-4247-85cc-76049a84eef6\") " pod="openshift-marketplace/redhat-operators-ngjmc" Feb 27 00:40:23 crc kubenswrapper[4781]: I0227 00:40:23.745959 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ngjmc" Feb 27 00:40:23 crc kubenswrapper[4781]: I0227 00:40:23.881440 4781 generic.go:334] "Generic (PLEG): container finished" podID="98c901e2-eff5-4256-9add-25d09beb51e3" containerID="81556b2b1512e6b2cac6ee77543475833768d132f92f55999253cefef07fe4fe" exitCode=0 Feb 27 00:40:23 crc kubenswrapper[4781]: I0227 00:40:23.881498 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz" event={"ID":"98c901e2-eff5-4256-9add-25d09beb51e3","Type":"ContainerDied","Data":"81556b2b1512e6b2cac6ee77543475833768d132f92f55999253cefef07fe4fe"} Feb 27 00:40:24 crc kubenswrapper[4781]: I0227 00:40:24.233492 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ngjmc"] Feb 27 00:40:24 crc kubenswrapper[4781]: I0227 00:40:24.891297 4781 generic.go:334] "Generic (PLEG): container finished" podID="c5ccb94d-a0c4-4247-85cc-76049a84eef6" containerID="1d21840f6fe3859320b67c6fb1a962662544c060f8285bedfa75a79b0244e914" exitCode=0 Feb 27 00:40:24 crc kubenswrapper[4781]: I0227 00:40:24.891412 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngjmc" event={"ID":"c5ccb94d-a0c4-4247-85cc-76049a84eef6","Type":"ContainerDied","Data":"1d21840f6fe3859320b67c6fb1a962662544c060f8285bedfa75a79b0244e914"} Feb 27 00:40:24 crc kubenswrapper[4781]: I0227 00:40:24.891707 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngjmc" event={"ID":"c5ccb94d-a0c4-4247-85cc-76049a84eef6","Type":"ContainerStarted","Data":"c55aee4887a25d0bd23791f9a694c6155337621e7dac3a4e5f392a9f73d0d36d"} Feb 27 00:40:25 crc kubenswrapper[4781]: I0227 00:40:25.401001 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz" Feb 27 00:40:25 crc kubenswrapper[4781]: I0227 00:40:25.482015 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98c901e2-eff5-4256-9add-25d09beb51e3-ssh-key-openstack-edpm-ipam\") pod \"98c901e2-eff5-4256-9add-25d09beb51e3\" (UID: \"98c901e2-eff5-4256-9add-25d09beb51e3\") " Feb 27 00:40:25 crc kubenswrapper[4781]: I0227 00:40:25.482143 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kl6n\" (UniqueName: \"kubernetes.io/projected/98c901e2-eff5-4256-9add-25d09beb51e3-kube-api-access-8kl6n\") pod \"98c901e2-eff5-4256-9add-25d09beb51e3\" (UID: \"98c901e2-eff5-4256-9add-25d09beb51e3\") " Feb 27 00:40:25 crc kubenswrapper[4781]: I0227 00:40:25.482186 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98c901e2-eff5-4256-9add-25d09beb51e3-inventory\") pod \"98c901e2-eff5-4256-9add-25d09beb51e3\" (UID: \"98c901e2-eff5-4256-9add-25d09beb51e3\") " Feb 27 00:40:25 crc kubenswrapper[4781]: I0227 00:40:25.487365 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98c901e2-eff5-4256-9add-25d09beb51e3-kube-api-access-8kl6n" (OuterVolumeSpecName: "kube-api-access-8kl6n") pod "98c901e2-eff5-4256-9add-25d09beb51e3" (UID: "98c901e2-eff5-4256-9add-25d09beb51e3"). InnerVolumeSpecName "kube-api-access-8kl6n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:40:25 crc kubenswrapper[4781]: I0227 00:40:25.510203 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98c901e2-eff5-4256-9add-25d09beb51e3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "98c901e2-eff5-4256-9add-25d09beb51e3" (UID: "98c901e2-eff5-4256-9add-25d09beb51e3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:40:25 crc kubenswrapper[4781]: I0227 00:40:25.515126 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98c901e2-eff5-4256-9add-25d09beb51e3-inventory" (OuterVolumeSpecName: "inventory") pod "98c901e2-eff5-4256-9add-25d09beb51e3" (UID: "98c901e2-eff5-4256-9add-25d09beb51e3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:40:25 crc kubenswrapper[4781]: I0227 00:40:25.585332 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98c901e2-eff5-4256-9add-25d09beb51e3-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 00:40:25 crc kubenswrapper[4781]: I0227 00:40:25.585373 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98c901e2-eff5-4256-9add-25d09beb51e3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 00:40:25 crc kubenswrapper[4781]: I0227 00:40:25.585398 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kl6n\" (UniqueName: \"kubernetes.io/projected/98c901e2-eff5-4256-9add-25d09beb51e3-kube-api-access-8kl6n\") on node \"crc\" DevicePath \"\"" Feb 27 00:40:25 crc kubenswrapper[4781]: I0227 00:40:25.904131 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz" 
event={"ID":"98c901e2-eff5-4256-9add-25d09beb51e3","Type":"ContainerDied","Data":"5c51816116cb2a00768333129536aa3bda367f597a1b5a8af5d31966b94ebe8f"} Feb 27 00:40:25 crc kubenswrapper[4781]: I0227 00:40:25.904485 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c51816116cb2a00768333129536aa3bda367f597a1b5a8af5d31966b94ebe8f" Feb 27 00:40:25 crc kubenswrapper[4781]: I0227 00:40:25.904145 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz" Feb 27 00:40:25 crc kubenswrapper[4781]: I0227 00:40:25.908085 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngjmc" event={"ID":"c5ccb94d-a0c4-4247-85cc-76049a84eef6","Type":"ContainerStarted","Data":"1ec68491988558a85bedfb2c63c5c907bbcba863f54502344fb220a9f95a6f73"} Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.029484 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894"] Feb 27 00:40:26 crc kubenswrapper[4781]: E0227 00:40:26.030200 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98c901e2-eff5-4256-9add-25d09beb51e3" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.030223 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="98c901e2-eff5-4256-9add-25d09beb51e3" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.030482 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="98c901e2-eff5-4256-9add-25d09beb51e3" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.031300 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.034281 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.034776 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.034911 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.034944 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mvxs7" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.035655 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.035941 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.036309 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.036498 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.043291 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894"] Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.094933 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.095000 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.095049 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.095108 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.095135 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.095181 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.095205 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.095280 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.095314 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.095450 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkm55\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-kube-api-access-kkm55\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.095511 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.095537 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.095578 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" 
(UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.095615 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.197676 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.197766 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.197918 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkm55\" (UniqueName: 
\"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-kube-api-access-kkm55\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.197999 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.198035 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.198110 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.198179 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.198274 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.198357 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.198441 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.198590 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-nova-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.198670 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.198784 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.198820 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.204409 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 
00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.204565 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.204659 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.204707 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.205430 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 
00:40:26.205482 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.205572 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.205733 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.206341 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.206784 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.206811 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.207606 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.207870 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.216377 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkm55\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-kube-api-access-kkm55\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fx894\" (UID: 
\"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.351261 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.881760 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894"] Feb 27 00:40:26 crc kubenswrapper[4781]: W0227 00:40:26.882185 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0dace61f_2e30_4132_9ce6_1cb1c8a6cedc.slice/crio-fdbe6dbed5dcbc407181e019c6042c54a0b20057226462faf8d516b97be9c31e WatchSource:0}: Error finding container fdbe6dbed5dcbc407181e019c6042c54a0b20057226462faf8d516b97be9c31e: Status 404 returned error can't find the container with id fdbe6dbed5dcbc407181e019c6042c54a0b20057226462faf8d516b97be9c31e Feb 27 00:40:26 crc kubenswrapper[4781]: I0227 00:40:26.922285 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" event={"ID":"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc","Type":"ContainerStarted","Data":"fdbe6dbed5dcbc407181e019c6042c54a0b20057226462faf8d516b97be9c31e"} Feb 27 00:40:29 crc kubenswrapper[4781]: I0227 00:40:29.954166 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" event={"ID":"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc","Type":"ContainerStarted","Data":"963155c7618099d355cb8c863003fa537b8c82a66251e59d4f497102028cdca7"} Feb 27 00:40:29 crc kubenswrapper[4781]: I0227 00:40:29.984996 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" podStartSLOduration=2.017666099 
podStartE2EDuration="3.984970239s" podCreationTimestamp="2026-02-27 00:40:26 +0000 UTC" firstStartedPulling="2026-02-27 00:40:26.885881057 +0000 UTC m=+2096.143420611" lastFinishedPulling="2026-02-27 00:40:28.853185197 +0000 UTC m=+2098.110724751" observedRunningTime="2026-02-27 00:40:29.97660397 +0000 UTC m=+2099.234143544" watchObservedRunningTime="2026-02-27 00:40:29.984970239 +0000 UTC m=+2099.242509793" Feb 27 00:40:30 crc kubenswrapper[4781]: I0227 00:40:30.643760 4781 scope.go:117] "RemoveContainer" containerID="603be41f44dabcefd367f03b819f0e12526431539cc454d1e0a0fbbe4c354d4e" Feb 27 00:40:30 crc kubenswrapper[4781]: I0227 00:40:30.708511 4781 scope.go:117] "RemoveContainer" containerID="172b3310c26572010bb7e76f998ac931b571b090edac45e7e85d3b3c5cd6c47d" Feb 27 00:40:34 crc kubenswrapper[4781]: I0227 00:40:34.998882 4781 generic.go:334] "Generic (PLEG): container finished" podID="c5ccb94d-a0c4-4247-85cc-76049a84eef6" containerID="1ec68491988558a85bedfb2c63c5c907bbcba863f54502344fb220a9f95a6f73" exitCode=0 Feb 27 00:40:34 crc kubenswrapper[4781]: I0227 00:40:34.998963 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngjmc" event={"ID":"c5ccb94d-a0c4-4247-85cc-76049a84eef6","Type":"ContainerDied","Data":"1ec68491988558a85bedfb2c63c5c907bbcba863f54502344fb220a9f95a6f73"} Feb 27 00:40:36 crc kubenswrapper[4781]: I0227 00:40:36.012747 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngjmc" event={"ID":"c5ccb94d-a0c4-4247-85cc-76049a84eef6","Type":"ContainerStarted","Data":"1acf12a7d06cdbf5094e9c421c17612a1f2bdee30903a59862aa98eb4e307058"} Feb 27 00:40:36 crc kubenswrapper[4781]: I0227 00:40:36.043543 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ngjmc" podStartSLOduration=2.534584912 podStartE2EDuration="13.043519367s" podCreationTimestamp="2026-02-27 00:40:23 +0000 UTC" 
firstStartedPulling="2026-02-27 00:40:24.893318648 +0000 UTC m=+2094.150858202" lastFinishedPulling="2026-02-27 00:40:35.402253103 +0000 UTC m=+2104.659792657" observedRunningTime="2026-02-27 00:40:36.034265126 +0000 UTC m=+2105.291804690" watchObservedRunningTime="2026-02-27 00:40:36.043519367 +0000 UTC m=+2105.301058921" Feb 27 00:40:43 crc kubenswrapper[4781]: I0227 00:40:43.746742 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ngjmc" Feb 27 00:40:43 crc kubenswrapper[4781]: I0227 00:40:43.747213 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ngjmc" Feb 27 00:40:43 crc kubenswrapper[4781]: I0227 00:40:43.794700 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ngjmc" Feb 27 00:40:44 crc kubenswrapper[4781]: I0227 00:40:44.152570 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ngjmc" Feb 27 00:40:44 crc kubenswrapper[4781]: I0227 00:40:44.201715 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ngjmc"] Feb 27 00:40:46 crc kubenswrapper[4781]: I0227 00:40:46.105792 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ngjmc" podUID="c5ccb94d-a0c4-4247-85cc-76049a84eef6" containerName="registry-server" containerID="cri-o://1acf12a7d06cdbf5094e9c421c17612a1f2bdee30903a59862aa98eb4e307058" gracePeriod=2 Feb 27 00:40:46 crc kubenswrapper[4781]: I0227 00:40:46.673564 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ngjmc" Feb 27 00:40:46 crc kubenswrapper[4781]: I0227 00:40:46.842991 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5ccb94d-a0c4-4247-85cc-76049a84eef6-catalog-content\") pod \"c5ccb94d-a0c4-4247-85cc-76049a84eef6\" (UID: \"c5ccb94d-a0c4-4247-85cc-76049a84eef6\") " Feb 27 00:40:46 crc kubenswrapper[4781]: I0227 00:40:46.843546 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8v97\" (UniqueName: \"kubernetes.io/projected/c5ccb94d-a0c4-4247-85cc-76049a84eef6-kube-api-access-b8v97\") pod \"c5ccb94d-a0c4-4247-85cc-76049a84eef6\" (UID: \"c5ccb94d-a0c4-4247-85cc-76049a84eef6\") " Feb 27 00:40:46 crc kubenswrapper[4781]: I0227 00:40:46.843680 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5ccb94d-a0c4-4247-85cc-76049a84eef6-utilities\") pod \"c5ccb94d-a0c4-4247-85cc-76049a84eef6\" (UID: \"c5ccb94d-a0c4-4247-85cc-76049a84eef6\") " Feb 27 00:40:46 crc kubenswrapper[4781]: I0227 00:40:46.844324 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5ccb94d-a0c4-4247-85cc-76049a84eef6-utilities" (OuterVolumeSpecName: "utilities") pod "c5ccb94d-a0c4-4247-85cc-76049a84eef6" (UID: "c5ccb94d-a0c4-4247-85cc-76049a84eef6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:40:46 crc kubenswrapper[4781]: I0227 00:40:46.852532 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5ccb94d-a0c4-4247-85cc-76049a84eef6-kube-api-access-b8v97" (OuterVolumeSpecName: "kube-api-access-b8v97") pod "c5ccb94d-a0c4-4247-85cc-76049a84eef6" (UID: "c5ccb94d-a0c4-4247-85cc-76049a84eef6"). InnerVolumeSpecName "kube-api-access-b8v97". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:40:46 crc kubenswrapper[4781]: I0227 00:40:46.945447 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8v97\" (UniqueName: \"kubernetes.io/projected/c5ccb94d-a0c4-4247-85cc-76049a84eef6-kube-api-access-b8v97\") on node \"crc\" DevicePath \"\"" Feb 27 00:40:46 crc kubenswrapper[4781]: I0227 00:40:46.945486 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5ccb94d-a0c4-4247-85cc-76049a84eef6-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:40:46 crc kubenswrapper[4781]: I0227 00:40:46.980562 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5ccb94d-a0c4-4247-85cc-76049a84eef6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c5ccb94d-a0c4-4247-85cc-76049a84eef6" (UID: "c5ccb94d-a0c4-4247-85cc-76049a84eef6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:40:47 crc kubenswrapper[4781]: I0227 00:40:47.047489 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5ccb94d-a0c4-4247-85cc-76049a84eef6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:40:47 crc kubenswrapper[4781]: I0227 00:40:47.118787 4781 generic.go:334] "Generic (PLEG): container finished" podID="c5ccb94d-a0c4-4247-85cc-76049a84eef6" containerID="1acf12a7d06cdbf5094e9c421c17612a1f2bdee30903a59862aa98eb4e307058" exitCode=0 Feb 27 00:40:47 crc kubenswrapper[4781]: I0227 00:40:47.118836 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ngjmc" Feb 27 00:40:47 crc kubenswrapper[4781]: I0227 00:40:47.118834 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngjmc" event={"ID":"c5ccb94d-a0c4-4247-85cc-76049a84eef6","Type":"ContainerDied","Data":"1acf12a7d06cdbf5094e9c421c17612a1f2bdee30903a59862aa98eb4e307058"} Feb 27 00:40:47 crc kubenswrapper[4781]: I0227 00:40:47.118994 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngjmc" event={"ID":"c5ccb94d-a0c4-4247-85cc-76049a84eef6","Type":"ContainerDied","Data":"c55aee4887a25d0bd23791f9a694c6155337621e7dac3a4e5f392a9f73d0d36d"} Feb 27 00:40:47 crc kubenswrapper[4781]: I0227 00:40:47.119022 4781 scope.go:117] "RemoveContainer" containerID="1acf12a7d06cdbf5094e9c421c17612a1f2bdee30903a59862aa98eb4e307058" Feb 27 00:40:47 crc kubenswrapper[4781]: I0227 00:40:47.150087 4781 scope.go:117] "RemoveContainer" containerID="1ec68491988558a85bedfb2c63c5c907bbcba863f54502344fb220a9f95a6f73" Feb 27 00:40:47 crc kubenswrapper[4781]: I0227 00:40:47.166419 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ngjmc"] Feb 27 00:40:47 crc kubenswrapper[4781]: I0227 00:40:47.175917 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ngjmc"] Feb 27 00:40:47 crc kubenswrapper[4781]: I0227 00:40:47.182861 4781 scope.go:117] "RemoveContainer" containerID="1d21840f6fe3859320b67c6fb1a962662544c060f8285bedfa75a79b0244e914" Feb 27 00:40:47 crc kubenswrapper[4781]: I0227 00:40:47.233600 4781 scope.go:117] "RemoveContainer" containerID="1acf12a7d06cdbf5094e9c421c17612a1f2bdee30903a59862aa98eb4e307058" Feb 27 00:40:47 crc kubenswrapper[4781]: E0227 00:40:47.234034 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1acf12a7d06cdbf5094e9c421c17612a1f2bdee30903a59862aa98eb4e307058\": container with ID starting with 1acf12a7d06cdbf5094e9c421c17612a1f2bdee30903a59862aa98eb4e307058 not found: ID does not exist" containerID="1acf12a7d06cdbf5094e9c421c17612a1f2bdee30903a59862aa98eb4e307058" Feb 27 00:40:47 crc kubenswrapper[4781]: I0227 00:40:47.234065 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1acf12a7d06cdbf5094e9c421c17612a1f2bdee30903a59862aa98eb4e307058"} err="failed to get container status \"1acf12a7d06cdbf5094e9c421c17612a1f2bdee30903a59862aa98eb4e307058\": rpc error: code = NotFound desc = could not find container \"1acf12a7d06cdbf5094e9c421c17612a1f2bdee30903a59862aa98eb4e307058\": container with ID starting with 1acf12a7d06cdbf5094e9c421c17612a1f2bdee30903a59862aa98eb4e307058 not found: ID does not exist" Feb 27 00:40:47 crc kubenswrapper[4781]: I0227 00:40:47.234086 4781 scope.go:117] "RemoveContainer" containerID="1ec68491988558a85bedfb2c63c5c907bbcba863f54502344fb220a9f95a6f73" Feb 27 00:40:47 crc kubenswrapper[4781]: E0227 00:40:47.234346 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ec68491988558a85bedfb2c63c5c907bbcba863f54502344fb220a9f95a6f73\": container with ID starting with 1ec68491988558a85bedfb2c63c5c907bbcba863f54502344fb220a9f95a6f73 not found: ID does not exist" containerID="1ec68491988558a85bedfb2c63c5c907bbcba863f54502344fb220a9f95a6f73" Feb 27 00:40:47 crc kubenswrapper[4781]: I0227 00:40:47.234367 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ec68491988558a85bedfb2c63c5c907bbcba863f54502344fb220a9f95a6f73"} err="failed to get container status \"1ec68491988558a85bedfb2c63c5c907bbcba863f54502344fb220a9f95a6f73\": rpc error: code = NotFound desc = could not find container \"1ec68491988558a85bedfb2c63c5c907bbcba863f54502344fb220a9f95a6f73\": container with ID 
starting with 1ec68491988558a85bedfb2c63c5c907bbcba863f54502344fb220a9f95a6f73 not found: ID does not exist" Feb 27 00:40:47 crc kubenswrapper[4781]: I0227 00:40:47.234384 4781 scope.go:117] "RemoveContainer" containerID="1d21840f6fe3859320b67c6fb1a962662544c060f8285bedfa75a79b0244e914" Feb 27 00:40:47 crc kubenswrapper[4781]: E0227 00:40:47.234692 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d21840f6fe3859320b67c6fb1a962662544c060f8285bedfa75a79b0244e914\": container with ID starting with 1d21840f6fe3859320b67c6fb1a962662544c060f8285bedfa75a79b0244e914 not found: ID does not exist" containerID="1d21840f6fe3859320b67c6fb1a962662544c060f8285bedfa75a79b0244e914" Feb 27 00:40:47 crc kubenswrapper[4781]: I0227 00:40:47.234710 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d21840f6fe3859320b67c6fb1a962662544c060f8285bedfa75a79b0244e914"} err="failed to get container status \"1d21840f6fe3859320b67c6fb1a962662544c060f8285bedfa75a79b0244e914\": rpc error: code = NotFound desc = could not find container \"1d21840f6fe3859320b67c6fb1a962662544c060f8285bedfa75a79b0244e914\": container with ID starting with 1d21840f6fe3859320b67c6fb1a962662544c060f8285bedfa75a79b0244e914 not found: ID does not exist" Feb 27 00:40:47 crc kubenswrapper[4781]: I0227 00:40:47.342707 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5ccb94d-a0c4-4247-85cc-76049a84eef6" path="/var/lib/kubelet/pods/c5ccb94d-a0c4-4247-85cc-76049a84eef6/volumes" Feb 27 00:41:02 crc kubenswrapper[4781]: I0227 00:41:02.292986 4781 generic.go:334] "Generic (PLEG): container finished" podID="0dace61f-2e30-4132-9ce6-1cb1c8a6cedc" containerID="963155c7618099d355cb8c863003fa537b8c82a66251e59d4f497102028cdca7" exitCode=0 Feb 27 00:41:02 crc kubenswrapper[4781]: I0227 00:41:02.293099 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" event={"ID":"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc","Type":"ContainerDied","Data":"963155c7618099d355cb8c863003fa537b8c82a66251e59d4f497102028cdca7"} Feb 27 00:41:03 crc kubenswrapper[4781]: I0227 00:41:03.872363 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.022195 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.022251 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-libvirt-combined-ca-bundle\") pod \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.022344 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-inventory\") pod \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.022403 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-neutron-metadata-combined-ca-bundle\") pod \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " Feb 27 00:41:04 crc kubenswrapper[4781]: 
I0227 00:41:04.022439 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.022482 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-repo-setup-combined-ca-bundle\") pod \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.022549 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-ssh-key-openstack-edpm-ipam\") pod \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.022571 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-telemetry-combined-ca-bundle\") pod \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.022606 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-nova-combined-ca-bundle\") pod \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.022647 4781 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-bootstrap-combined-ca-bundle\") pod \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.022667 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkm55\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-kube-api-access-kkm55\") pod \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.022684 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.022707 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-ovn-default-certs-0\") pod \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.022805 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-ovn-combined-ca-bundle\") pod \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\" (UID: \"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc\") " Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.036986 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-kube-api-access-kkm55" (OuterVolumeSpecName: "kube-api-access-kkm55") pod "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc" (UID: "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc"). InnerVolumeSpecName "kube-api-access-kkm55". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.039040 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc" (UID: "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.040077 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc" (UID: "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.040339 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc" (UID: "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.040358 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc" (UID: "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.040415 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc" (UID: "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.040623 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc" (UID: "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.040675 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc" (UID: "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.040955 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc" (UID: "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.041613 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc" (UID: "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.041886 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc" (UID: "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.044248 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc" (UID: "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc"). 
InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.066589 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-inventory" (OuterVolumeSpecName: "inventory") pod "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc" (UID: "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.077466 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc" (UID: "0dace61f-2e30-4132-9ce6-1cb1c8a6cedc"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.125024 4781 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.125066 4781 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.125080 4781 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.125091 4781 reconciler_common.go:293] "Volume 
detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.125100 4781 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.125109 4781 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.125119 4781 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.125128 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.125136 4781 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.125145 4781 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:41:04 crc 
kubenswrapper[4781]: I0227 00:41:04.125153 4781 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.125161 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkm55\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-kube-api-access-kkm55\") on node \"crc\" DevicePath \"\"" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.125169 4781 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.125182 4781 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dace61f-2e30-4132-9ce6-1cb1c8a6cedc-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.315753 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" event={"ID":"0dace61f-2e30-4132-9ce6-1cb1c8a6cedc","Type":"ContainerDied","Data":"fdbe6dbed5dcbc407181e019c6042c54a0b20057226462faf8d516b97be9c31e"} Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.315801 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdbe6dbed5dcbc407181e019c6042c54a0b20057226462faf8d516b97be9c31e" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.315818 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fx894" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.423795 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw"] Feb 27 00:41:04 crc kubenswrapper[4781]: E0227 00:41:04.424322 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5ccb94d-a0c4-4247-85cc-76049a84eef6" containerName="extract-utilities" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.424342 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5ccb94d-a0c4-4247-85cc-76049a84eef6" containerName="extract-utilities" Feb 27 00:41:04 crc kubenswrapper[4781]: E0227 00:41:04.424369 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5ccb94d-a0c4-4247-85cc-76049a84eef6" containerName="extract-content" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.424377 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5ccb94d-a0c4-4247-85cc-76049a84eef6" containerName="extract-content" Feb 27 00:41:04 crc kubenswrapper[4781]: E0227 00:41:04.424393 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dace61f-2e30-4132-9ce6-1cb1c8a6cedc" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.424402 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dace61f-2e30-4132-9ce6-1cb1c8a6cedc" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 27 00:41:04 crc kubenswrapper[4781]: E0227 00:41:04.424414 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5ccb94d-a0c4-4247-85cc-76049a84eef6" containerName="registry-server" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.424421 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5ccb94d-a0c4-4247-85cc-76049a84eef6" containerName="registry-server" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.424717 
4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dace61f-2e30-4132-9ce6-1cb1c8a6cedc" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.424748 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5ccb94d-a0c4-4247-85cc-76049a84eef6" containerName="registry-server" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.425691 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.429177 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.429952 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.430297 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.431619 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mvxs7" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.432928 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.435959 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw"] Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.532768 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e61bcd0e-2490-4f8e-a429-cf07405dc01b-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-27lcw\" (UID: 
\"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.532847 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e61bcd0e-2490-4f8e-a429-cf07405dc01b-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-27lcw\" (UID: \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.532884 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xkxl\" (UniqueName: \"kubernetes.io/projected/e61bcd0e-2490-4f8e-a429-cf07405dc01b-kube-api-access-4xkxl\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-27lcw\" (UID: \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.533004 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e61bcd0e-2490-4f8e-a429-cf07405dc01b-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-27lcw\" (UID: \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.533073 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e61bcd0e-2490-4f8e-a429-cf07405dc01b-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-27lcw\" (UID: \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.635130 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e61bcd0e-2490-4f8e-a429-cf07405dc01b-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-27lcw\" (UID: \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.635199 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xkxl\" (UniqueName: \"kubernetes.io/projected/e61bcd0e-2490-4f8e-a429-cf07405dc01b-kube-api-access-4xkxl\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-27lcw\" (UID: \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.635320 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e61bcd0e-2490-4f8e-a429-cf07405dc01b-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-27lcw\" (UID: \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.635352 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e61bcd0e-2490-4f8e-a429-cf07405dc01b-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-27lcw\" (UID: \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.635390 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e61bcd0e-2490-4f8e-a429-cf07405dc01b-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-27lcw\" (UID: 
\"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.636559 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e61bcd0e-2490-4f8e-a429-cf07405dc01b-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-27lcw\" (UID: \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.640056 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e61bcd0e-2490-4f8e-a429-cf07405dc01b-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-27lcw\" (UID: \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.642961 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e61bcd0e-2490-4f8e-a429-cf07405dc01b-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-27lcw\" (UID: \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.646920 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e61bcd0e-2490-4f8e-a429-cf07405dc01b-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-27lcw\" (UID: \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.656340 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xkxl\" (UniqueName: 
\"kubernetes.io/projected/e61bcd0e-2490-4f8e-a429-cf07405dc01b-kube-api-access-4xkxl\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-27lcw\" (UID: \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:41:04 crc kubenswrapper[4781]: I0227 00:41:04.741955 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:41:05 crc kubenswrapper[4781]: I0227 00:41:05.261275 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw"] Feb 27 00:41:05 crc kubenswrapper[4781]: I0227 00:41:05.336140 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" event={"ID":"e61bcd0e-2490-4f8e-a429-cf07405dc01b","Type":"ContainerStarted","Data":"51c501825e1346a8a2e129063a87294aecfd1918c864ddba2f164fb624184d12"} Feb 27 00:41:06 crc kubenswrapper[4781]: I0227 00:41:06.346908 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" event={"ID":"e61bcd0e-2490-4f8e-a429-cf07405dc01b","Type":"ContainerStarted","Data":"6ed88a448040dd872eaf65d70e7642dc99ecfe2b0ddaf21643e90282bdc141d5"} Feb 27 00:41:06 crc kubenswrapper[4781]: I0227 00:41:06.372653 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" podStartSLOduration=1.946775497 podStartE2EDuration="2.372618167s" podCreationTimestamp="2026-02-27 00:41:04 +0000 UTC" firstStartedPulling="2026-02-27 00:41:05.264176523 +0000 UTC m=+2134.521716077" lastFinishedPulling="2026-02-27 00:41:05.690019193 +0000 UTC m=+2134.947558747" observedRunningTime="2026-02-27 00:41:06.36660332 +0000 UTC m=+2135.624142874" watchObservedRunningTime="2026-02-27 00:41:06.372618167 +0000 UTC m=+2135.630157721" Feb 27 00:41:24 crc kubenswrapper[4781]: I0227 
00:41:24.043813 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-sync-6wz7g"] Feb 27 00:41:24 crc kubenswrapper[4781]: I0227 00:41:24.055504 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-sync-6wz7g"] Feb 27 00:41:25 crc kubenswrapper[4781]: I0227 00:41:25.670795 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b669382e-dffc-421d-80a3-82b928f54044" path="/var/lib/kubelet/pods/b669382e-dffc-421d-80a3-82b928f54044/volumes" Feb 27 00:41:30 crc kubenswrapper[4781]: I0227 00:41:30.804532 4781 scope.go:117] "RemoveContainer" containerID="08009d33d7dd60364f173703aa207fb7fe65cb10f22855e575d2a1e3d49e40a0" Feb 27 00:41:31 crc kubenswrapper[4781]: I0227 00:41:31.026112 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-storageinit-c5vn9"] Feb 27 00:41:31 crc kubenswrapper[4781]: I0227 00:41:31.034988 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-storageinit-c5vn9"] Feb 27 00:41:31 crc kubenswrapper[4781]: I0227 00:41:31.322229 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fee23b33-5d55-45c9-b024-0b4865019095" path="/var/lib/kubelet/pods/fee23b33-5d55-45c9-b024-0b4865019095/volumes" Feb 27 00:42:00 crc kubenswrapper[4781]: I0227 00:42:00.155561 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535882-skl65"] Feb 27 00:42:00 crc kubenswrapper[4781]: I0227 00:42:00.159064 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535882-skl65" Feb 27 00:42:00 crc kubenswrapper[4781]: I0227 00:42:00.161171 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:42:00 crc kubenswrapper[4781]: I0227 00:42:00.161532 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:42:00 crc kubenswrapper[4781]: I0227 00:42:00.162177 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:42:00 crc kubenswrapper[4781]: I0227 00:42:00.172281 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535882-skl65"] Feb 27 00:42:00 crc kubenswrapper[4781]: I0227 00:42:00.303257 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhcrr\" (UniqueName: \"kubernetes.io/projected/29db339c-88ad-410b-bad1-e5f5328e9a0a-kube-api-access-vhcrr\") pod \"auto-csr-approver-29535882-skl65\" (UID: \"29db339c-88ad-410b-bad1-e5f5328e9a0a\") " pod="openshift-infra/auto-csr-approver-29535882-skl65" Feb 27 00:42:00 crc kubenswrapper[4781]: I0227 00:42:00.405681 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhcrr\" (UniqueName: \"kubernetes.io/projected/29db339c-88ad-410b-bad1-e5f5328e9a0a-kube-api-access-vhcrr\") pod \"auto-csr-approver-29535882-skl65\" (UID: \"29db339c-88ad-410b-bad1-e5f5328e9a0a\") " pod="openshift-infra/auto-csr-approver-29535882-skl65" Feb 27 00:42:00 crc kubenswrapper[4781]: I0227 00:42:00.425397 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhcrr\" (UniqueName: \"kubernetes.io/projected/29db339c-88ad-410b-bad1-e5f5328e9a0a-kube-api-access-vhcrr\") pod \"auto-csr-approver-29535882-skl65\" (UID: \"29db339c-88ad-410b-bad1-e5f5328e9a0a\") " 
pod="openshift-infra/auto-csr-approver-29535882-skl65" Feb 27 00:42:00 crc kubenswrapper[4781]: I0227 00:42:00.489811 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535882-skl65" Feb 27 00:42:00 crc kubenswrapper[4781]: I0227 00:42:00.991549 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535882-skl65"] Feb 27 00:42:01 crc kubenswrapper[4781]: I0227 00:42:01.030543 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535882-skl65" event={"ID":"29db339c-88ad-410b-bad1-e5f5328e9a0a","Type":"ContainerStarted","Data":"68c1340bfed486bf6f77881531a13b2ec6ce5bdb5c017e706d1af3feb87c99af"} Feb 27 00:42:03 crc kubenswrapper[4781]: I0227 00:42:03.063528 4781 generic.go:334] "Generic (PLEG): container finished" podID="e61bcd0e-2490-4f8e-a429-cf07405dc01b" containerID="6ed88a448040dd872eaf65d70e7642dc99ecfe2b0ddaf21643e90282bdc141d5" exitCode=0 Feb 27 00:42:03 crc kubenswrapper[4781]: I0227 00:42:03.063652 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" event={"ID":"e61bcd0e-2490-4f8e-a429-cf07405dc01b","Type":"ContainerDied","Data":"6ed88a448040dd872eaf65d70e7642dc99ecfe2b0ddaf21643e90282bdc141d5"} Feb 27 00:42:03 crc kubenswrapper[4781]: I0227 00:42:03.066412 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535882-skl65" event={"ID":"29db339c-88ad-410b-bad1-e5f5328e9a0a","Type":"ContainerStarted","Data":"bcc82c4ff93196fe9d1d81964a39e384053e68533a13a500ed58309dd14ee8eb"} Feb 27 00:42:03 crc kubenswrapper[4781]: I0227 00:42:03.091894 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535882-skl65" podStartSLOduration=1.294506254 podStartE2EDuration="3.091875121s" podCreationTimestamp="2026-02-27 00:42:00 +0000 UTC" 
firstStartedPulling="2026-02-27 00:42:00.998616758 +0000 UTC m=+2190.256156312" lastFinishedPulling="2026-02-27 00:42:02.795985615 +0000 UTC m=+2192.053525179" observedRunningTime="2026-02-27 00:42:03.091111431 +0000 UTC m=+2192.348651005" watchObservedRunningTime="2026-02-27 00:42:03.091875121 +0000 UTC m=+2192.349414675" Feb 27 00:42:04 crc kubenswrapper[4781]: I0227 00:42:04.081515 4781 generic.go:334] "Generic (PLEG): container finished" podID="29db339c-88ad-410b-bad1-e5f5328e9a0a" containerID="bcc82c4ff93196fe9d1d81964a39e384053e68533a13a500ed58309dd14ee8eb" exitCode=0 Feb 27 00:42:04 crc kubenswrapper[4781]: I0227 00:42:04.081652 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535882-skl65" event={"ID":"29db339c-88ad-410b-bad1-e5f5328e9a0a","Type":"ContainerDied","Data":"bcc82c4ff93196fe9d1d81964a39e384053e68533a13a500ed58309dd14ee8eb"} Feb 27 00:42:04 crc kubenswrapper[4781]: I0227 00:42:04.695780 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:42:04 crc kubenswrapper[4781]: I0227 00:42:04.706519 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e61bcd0e-2490-4f8e-a429-cf07405dc01b-inventory\") pod \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\" (UID: \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " Feb 27 00:42:04 crc kubenswrapper[4781]: I0227 00:42:04.706710 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e61bcd0e-2490-4f8e-a429-cf07405dc01b-ssh-key-openstack-edpm-ipam\") pod \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\" (UID: \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " Feb 27 00:42:04 crc kubenswrapper[4781]: I0227 00:42:04.708264 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e61bcd0e-2490-4f8e-a429-cf07405dc01b-ovncontroller-config-0\") pod \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\" (UID: \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " Feb 27 00:42:04 crc kubenswrapper[4781]: I0227 00:42:04.708343 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xkxl\" (UniqueName: \"kubernetes.io/projected/e61bcd0e-2490-4f8e-a429-cf07405dc01b-kube-api-access-4xkxl\") pod \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\" (UID: \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " Feb 27 00:42:04 crc kubenswrapper[4781]: I0227 00:42:04.708501 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e61bcd0e-2490-4f8e-a429-cf07405dc01b-ovn-combined-ca-bundle\") pod \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\" (UID: \"e61bcd0e-2490-4f8e-a429-cf07405dc01b\") " Feb 27 00:42:04 crc kubenswrapper[4781]: I0227 00:42:04.713584 4781 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e61bcd0e-2490-4f8e-a429-cf07405dc01b-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "e61bcd0e-2490-4f8e-a429-cf07405dc01b" (UID: "e61bcd0e-2490-4f8e-a429-cf07405dc01b"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:42:04 crc kubenswrapper[4781]: I0227 00:42:04.716073 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e61bcd0e-2490-4f8e-a429-cf07405dc01b-kube-api-access-4xkxl" (OuterVolumeSpecName: "kube-api-access-4xkxl") pod "e61bcd0e-2490-4f8e-a429-cf07405dc01b" (UID: "e61bcd0e-2490-4f8e-a429-cf07405dc01b"). InnerVolumeSpecName "kube-api-access-4xkxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:42:04 crc kubenswrapper[4781]: I0227 00:42:04.743181 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e61bcd0e-2490-4f8e-a429-cf07405dc01b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e61bcd0e-2490-4f8e-a429-cf07405dc01b" (UID: "e61bcd0e-2490-4f8e-a429-cf07405dc01b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:42:04 crc kubenswrapper[4781]: I0227 00:42:04.753148 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e61bcd0e-2490-4f8e-a429-cf07405dc01b-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "e61bcd0e-2490-4f8e-a429-cf07405dc01b" (UID: "e61bcd0e-2490-4f8e-a429-cf07405dc01b"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:42:04 crc kubenswrapper[4781]: I0227 00:42:04.767267 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e61bcd0e-2490-4f8e-a429-cf07405dc01b-inventory" (OuterVolumeSpecName: "inventory") pod "e61bcd0e-2490-4f8e-a429-cf07405dc01b" (UID: "e61bcd0e-2490-4f8e-a429-cf07405dc01b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:42:04 crc kubenswrapper[4781]: I0227 00:42:04.811932 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e61bcd0e-2490-4f8e-a429-cf07405dc01b-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 00:42:04 crc kubenswrapper[4781]: I0227 00:42:04.811976 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e61bcd0e-2490-4f8e-a429-cf07405dc01b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 00:42:04 crc kubenswrapper[4781]: I0227 00:42:04.811990 4781 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e61bcd0e-2490-4f8e-a429-cf07405dc01b-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 27 00:42:04 crc kubenswrapper[4781]: I0227 00:42:04.812002 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xkxl\" (UniqueName: \"kubernetes.io/projected/e61bcd0e-2490-4f8e-a429-cf07405dc01b-kube-api-access-4xkxl\") on node \"crc\" DevicePath \"\"" Feb 27 00:42:04 crc kubenswrapper[4781]: I0227 00:42:04.812015 4781 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e61bcd0e-2490-4f8e-a429-cf07405dc01b-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.093689 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" event={"ID":"e61bcd0e-2490-4f8e-a429-cf07405dc01b","Type":"ContainerDied","Data":"51c501825e1346a8a2e129063a87294aecfd1918c864ddba2f164fb624184d12"} Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.093725 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-27lcw" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.093739 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51c501825e1346a8a2e129063a87294aecfd1918c864ddba2f164fb624184d12" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.157427 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq"] Feb 27 00:42:05 crc kubenswrapper[4781]: E0227 00:42:05.158421 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e61bcd0e-2490-4f8e-a429-cf07405dc01b" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.158458 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e61bcd0e-2490-4f8e-a429-cf07405dc01b" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.158914 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e61bcd0e-2490-4f8e-a429-cf07405dc01b" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.161195 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.163113 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.165853 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mvxs7" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.166102 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.166293 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.166559 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.168070 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.175219 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq"] Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.220500 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.220644 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.220766 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.220790 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.220809 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.220830 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntw6w\" (UniqueName: \"kubernetes.io/projected/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-kube-api-access-ntw6w\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.324605 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.324716 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.324744 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.324781 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntw6w\" (UniqueName: 
\"kubernetes.io/projected/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-kube-api-access-ntw6w\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.324814 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.324917 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.329879 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.330505 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.330769 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.331660 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.338263 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.343502 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntw6w\" (UniqueName: \"kubernetes.io/projected/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-kube-api-access-ntw6w\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq\" (UID: 
\"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.497327 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.501454 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535882-skl65" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.528937 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhcrr\" (UniqueName: \"kubernetes.io/projected/29db339c-88ad-410b-bad1-e5f5328e9a0a-kube-api-access-vhcrr\") pod \"29db339c-88ad-410b-bad1-e5f5328e9a0a\" (UID: \"29db339c-88ad-410b-bad1-e5f5328e9a0a\") " Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.534570 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29db339c-88ad-410b-bad1-e5f5328e9a0a-kube-api-access-vhcrr" (OuterVolumeSpecName: "kube-api-access-vhcrr") pod "29db339c-88ad-410b-bad1-e5f5328e9a0a" (UID: "29db339c-88ad-410b-bad1-e5f5328e9a0a"). InnerVolumeSpecName "kube-api-access-vhcrr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:42:05 crc kubenswrapper[4781]: I0227 00:42:05.632540 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhcrr\" (UniqueName: \"kubernetes.io/projected/29db339c-88ad-410b-bad1-e5f5328e9a0a-kube-api-access-vhcrr\") on node \"crc\" DevicePath \"\"" Feb 27 00:42:06 crc kubenswrapper[4781]: I0227 00:42:06.086476 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq"] Feb 27 00:42:06 crc kubenswrapper[4781]: W0227 00:42:06.088831 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a3e8437_2d3f_44a9_bb1a_8b3de1e91c87.slice/crio-e006780cf4daccda68c6b25b213909e5856f96dd7fd97d562673bc7d27726bb2 WatchSource:0}: Error finding container e006780cf4daccda68c6b25b213909e5856f96dd7fd97d562673bc7d27726bb2: Status 404 returned error can't find the container with id e006780cf4daccda68c6b25b213909e5856f96dd7fd97d562673bc7d27726bb2 Feb 27 00:42:06 crc kubenswrapper[4781]: I0227 00:42:06.108034 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535882-skl65" event={"ID":"29db339c-88ad-410b-bad1-e5f5328e9a0a","Type":"ContainerDied","Data":"68c1340bfed486bf6f77881531a13b2ec6ce5bdb5c017e706d1af3feb87c99af"} Feb 27 00:42:06 crc kubenswrapper[4781]: I0227 00:42:06.108083 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68c1340bfed486bf6f77881531a13b2ec6ce5bdb5c017e706d1af3feb87c99af" Feb 27 00:42:06 crc kubenswrapper[4781]: I0227 00:42:06.108051 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535882-skl65" Feb 27 00:42:06 crc kubenswrapper[4781]: I0227 00:42:06.112948 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" event={"ID":"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87","Type":"ContainerStarted","Data":"e006780cf4daccda68c6b25b213909e5856f96dd7fd97d562673bc7d27726bb2"} Feb 27 00:42:06 crc kubenswrapper[4781]: I0227 00:42:06.166673 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535876-2l88l"] Feb 27 00:42:06 crc kubenswrapper[4781]: I0227 00:42:06.176644 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535876-2l88l"] Feb 27 00:42:07 crc kubenswrapper[4781]: I0227 00:42:07.123378 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" event={"ID":"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87","Type":"ContainerStarted","Data":"e8eaf2ec603b6edc8badc09564a9675ccb658970cd78310fb0d45ee49918516f"} Feb 27 00:42:07 crc kubenswrapper[4781]: I0227 00:42:07.145933 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" podStartSLOduration=1.749421307 podStartE2EDuration="2.145914081s" podCreationTimestamp="2026-02-27 00:42:05 +0000 UTC" firstStartedPulling="2026-02-27 00:42:06.093776234 +0000 UTC m=+2195.351315788" lastFinishedPulling="2026-02-27 00:42:06.490269008 +0000 UTC m=+2195.747808562" observedRunningTime="2026-02-27 00:42:07.140360645 +0000 UTC m=+2196.397900199" watchObservedRunningTime="2026-02-27 00:42:07.145914081 +0000 UTC m=+2196.403453635" Feb 27 00:42:07 crc kubenswrapper[4781]: I0227 00:42:07.348480 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9301966-9820-4623-8393-f185a0616743" 
path="/var/lib/kubelet/pods/f9301966-9820-4623-8393-f185a0616743/volumes" Feb 27 00:42:12 crc kubenswrapper[4781]: I0227 00:42:12.894989 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:42:12 crc kubenswrapper[4781]: I0227 00:42:12.895501 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:42:20 crc kubenswrapper[4781]: I0227 00:42:20.125602 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-j5kdg"] Feb 27 00:42:20 crc kubenswrapper[4781]: E0227 00:42:20.126796 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29db339c-88ad-410b-bad1-e5f5328e9a0a" containerName="oc" Feb 27 00:42:20 crc kubenswrapper[4781]: I0227 00:42:20.126812 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="29db339c-88ad-410b-bad1-e5f5328e9a0a" containerName="oc" Feb 27 00:42:20 crc kubenswrapper[4781]: I0227 00:42:20.127034 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="29db339c-88ad-410b-bad1-e5f5328e9a0a" containerName="oc" Feb 27 00:42:20 crc kubenswrapper[4781]: I0227 00:42:20.128524 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j5kdg" Feb 27 00:42:20 crc kubenswrapper[4781]: I0227 00:42:20.142737 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j5kdg"] Feb 27 00:42:20 crc kubenswrapper[4781]: I0227 00:42:20.167036 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpsmb\" (UniqueName: \"kubernetes.io/projected/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549-kube-api-access-lpsmb\") pod \"community-operators-j5kdg\" (UID: \"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549\") " pod="openshift-marketplace/community-operators-j5kdg" Feb 27 00:42:20 crc kubenswrapper[4781]: I0227 00:42:20.167345 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549-utilities\") pod \"community-operators-j5kdg\" (UID: \"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549\") " pod="openshift-marketplace/community-operators-j5kdg" Feb 27 00:42:20 crc kubenswrapper[4781]: I0227 00:42:20.167569 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549-catalog-content\") pod \"community-operators-j5kdg\" (UID: \"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549\") " pod="openshift-marketplace/community-operators-j5kdg" Feb 27 00:42:20 crc kubenswrapper[4781]: I0227 00:42:20.270147 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549-catalog-content\") pod \"community-operators-j5kdg\" (UID: \"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549\") " pod="openshift-marketplace/community-operators-j5kdg" Feb 27 00:42:20 crc kubenswrapper[4781]: I0227 00:42:20.270262 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lpsmb\" (UniqueName: \"kubernetes.io/projected/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549-kube-api-access-lpsmb\") pod \"community-operators-j5kdg\" (UID: \"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549\") " pod="openshift-marketplace/community-operators-j5kdg" Feb 27 00:42:20 crc kubenswrapper[4781]: I0227 00:42:20.270294 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549-utilities\") pod \"community-operators-j5kdg\" (UID: \"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549\") " pod="openshift-marketplace/community-operators-j5kdg" Feb 27 00:42:20 crc kubenswrapper[4781]: I0227 00:42:20.270800 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549-utilities\") pod \"community-operators-j5kdg\" (UID: \"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549\") " pod="openshift-marketplace/community-operators-j5kdg" Feb 27 00:42:20 crc kubenswrapper[4781]: I0227 00:42:20.271018 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549-catalog-content\") pod \"community-operators-j5kdg\" (UID: \"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549\") " pod="openshift-marketplace/community-operators-j5kdg" Feb 27 00:42:20 crc kubenswrapper[4781]: I0227 00:42:20.290410 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpsmb\" (UniqueName: \"kubernetes.io/projected/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549-kube-api-access-lpsmb\") pod \"community-operators-j5kdg\" (UID: \"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549\") " pod="openshift-marketplace/community-operators-j5kdg" Feb 27 00:42:20 crc kubenswrapper[4781]: I0227 00:42:20.449382 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j5kdg" Feb 27 00:42:21 crc kubenswrapper[4781]: I0227 00:42:21.008237 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j5kdg"] Feb 27 00:42:21 crc kubenswrapper[4781]: I0227 00:42:21.277965 4781 generic.go:334] "Generic (PLEG): container finished" podID="3bd52a8a-5bfd-45f3-8d26-90dcc81f2549" containerID="2fbf61290fa4aee7b5f7d7ed2e1a6d6a175da2775967ab4c314c14bc7cf150d5" exitCode=0 Feb 27 00:42:21 crc kubenswrapper[4781]: I0227 00:42:21.278083 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j5kdg" event={"ID":"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549","Type":"ContainerDied","Data":"2fbf61290fa4aee7b5f7d7ed2e1a6d6a175da2775967ab4c314c14bc7cf150d5"} Feb 27 00:42:21 crc kubenswrapper[4781]: I0227 00:42:21.278299 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j5kdg" event={"ID":"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549","Type":"ContainerStarted","Data":"c217e1db82494a5ce8a1988d7b8a9301ae905a7ad5c32fec2877aa3b52e831a0"} Feb 27 00:42:21 crc kubenswrapper[4781]: I0227 00:42:21.280641 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 00:42:22 crc kubenswrapper[4781]: I0227 00:42:22.291093 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j5kdg" event={"ID":"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549","Type":"ContainerStarted","Data":"40593e6c39e318c4bc42b772884d5b14e6880366dad393e45d23642abc403493"} Feb 27 00:42:24 crc kubenswrapper[4781]: I0227 00:42:24.321716 4781 generic.go:334] "Generic (PLEG): container finished" podID="3bd52a8a-5bfd-45f3-8d26-90dcc81f2549" containerID="40593e6c39e318c4bc42b772884d5b14e6880366dad393e45d23642abc403493" exitCode=0 Feb 27 00:42:24 crc kubenswrapper[4781]: I0227 00:42:24.322346 4781 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-j5kdg" event={"ID":"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549","Type":"ContainerDied","Data":"40593e6c39e318c4bc42b772884d5b14e6880366dad393e45d23642abc403493"} Feb 27 00:42:26 crc kubenswrapper[4781]: I0227 00:42:26.343045 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j5kdg" event={"ID":"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549","Type":"ContainerStarted","Data":"4572cbff2debbe399dc6df59ffc2c15b2ea56016116f3d018932b0bcc0eae69e"} Feb 27 00:42:26 crc kubenswrapper[4781]: I0227 00:42:26.361762 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-j5kdg" podStartSLOduration=2.678562532 podStartE2EDuration="6.361741623s" podCreationTimestamp="2026-02-27 00:42:20 +0000 UTC" firstStartedPulling="2026-02-27 00:42:21.280386289 +0000 UTC m=+2210.537925843" lastFinishedPulling="2026-02-27 00:42:24.96356538 +0000 UTC m=+2214.221104934" observedRunningTime="2026-02-27 00:42:26.359113734 +0000 UTC m=+2215.616653298" watchObservedRunningTime="2026-02-27 00:42:26.361741623 +0000 UTC m=+2215.619281177" Feb 27 00:42:30 crc kubenswrapper[4781]: I0227 00:42:30.450268 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-j5kdg" Feb 27 00:42:30 crc kubenswrapper[4781]: I0227 00:42:30.452428 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-j5kdg" Feb 27 00:42:30 crc kubenswrapper[4781]: I0227 00:42:30.501509 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-j5kdg" Feb 27 00:42:30 crc kubenswrapper[4781]: I0227 00:42:30.905382 4781 scope.go:117] "RemoveContainer" containerID="53c40723095bbd1b6e5cbec68ec5b0fac1a46ad7d3ad91a7ae622222a7ca48d5" Feb 27 00:42:31 crc kubenswrapper[4781]: I0227 00:42:31.007328 4781 
scope.go:117] "RemoveContainer" containerID="4c15c466d7915dc653aadd3dff0e84b4a8fd3f49a7805b84c66c98b2891abd65" Feb 27 00:42:31 crc kubenswrapper[4781]: I0227 00:42:31.435623 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-j5kdg" Feb 27 00:42:31 crc kubenswrapper[4781]: I0227 00:42:31.813114 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j5kdg"] Feb 27 00:42:33 crc kubenswrapper[4781]: I0227 00:42:33.405225 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-j5kdg" podUID="3bd52a8a-5bfd-45f3-8d26-90dcc81f2549" containerName="registry-server" containerID="cri-o://4572cbff2debbe399dc6df59ffc2c15b2ea56016116f3d018932b0bcc0eae69e" gracePeriod=2 Feb 27 00:42:34 crc kubenswrapper[4781]: I0227 00:42:34.422010 4781 generic.go:334] "Generic (PLEG): container finished" podID="3bd52a8a-5bfd-45f3-8d26-90dcc81f2549" containerID="4572cbff2debbe399dc6df59ffc2c15b2ea56016116f3d018932b0bcc0eae69e" exitCode=0 Feb 27 00:42:34 crc kubenswrapper[4781]: I0227 00:42:34.422145 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j5kdg" event={"ID":"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549","Type":"ContainerDied","Data":"4572cbff2debbe399dc6df59ffc2c15b2ea56016116f3d018932b0bcc0eae69e"} Feb 27 00:42:34 crc kubenswrapper[4781]: I0227 00:42:34.525940 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j5kdg" Feb 27 00:42:34 crc kubenswrapper[4781]: I0227 00:42:34.690254 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549-utilities\") pod \"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549\" (UID: \"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549\") " Feb 27 00:42:34 crc kubenswrapper[4781]: I0227 00:42:34.690359 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549-catalog-content\") pod \"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549\" (UID: \"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549\") " Feb 27 00:42:34 crc kubenswrapper[4781]: I0227 00:42:34.690415 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpsmb\" (UniqueName: \"kubernetes.io/projected/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549-kube-api-access-lpsmb\") pod \"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549\" (UID: \"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549\") " Feb 27 00:42:34 crc kubenswrapper[4781]: I0227 00:42:34.692408 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549-utilities" (OuterVolumeSpecName: "utilities") pod "3bd52a8a-5bfd-45f3-8d26-90dcc81f2549" (UID: "3bd52a8a-5bfd-45f3-8d26-90dcc81f2549"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:42:34 crc kubenswrapper[4781]: I0227 00:42:34.707785 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549-kube-api-access-lpsmb" (OuterVolumeSpecName: "kube-api-access-lpsmb") pod "3bd52a8a-5bfd-45f3-8d26-90dcc81f2549" (UID: "3bd52a8a-5bfd-45f3-8d26-90dcc81f2549"). InnerVolumeSpecName "kube-api-access-lpsmb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:42:34 crc kubenswrapper[4781]: I0227 00:42:34.761011 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3bd52a8a-5bfd-45f3-8d26-90dcc81f2549" (UID: "3bd52a8a-5bfd-45f3-8d26-90dcc81f2549"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:42:34 crc kubenswrapper[4781]: I0227 00:42:34.792397 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:42:34 crc kubenswrapper[4781]: I0227 00:42:34.792434 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:42:34 crc kubenswrapper[4781]: I0227 00:42:34.792447 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpsmb\" (UniqueName: \"kubernetes.io/projected/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549-kube-api-access-lpsmb\") on node \"crc\" DevicePath \"\"" Feb 27 00:42:35 crc kubenswrapper[4781]: I0227 00:42:35.435583 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j5kdg" event={"ID":"3bd52a8a-5bfd-45f3-8d26-90dcc81f2549","Type":"ContainerDied","Data":"c217e1db82494a5ce8a1988d7b8a9301ae905a7ad5c32fec2877aa3b52e831a0"} Feb 27 00:42:35 crc kubenswrapper[4781]: I0227 00:42:35.435674 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j5kdg" Feb 27 00:42:35 crc kubenswrapper[4781]: I0227 00:42:35.436331 4781 scope.go:117] "RemoveContainer" containerID="4572cbff2debbe399dc6df59ffc2c15b2ea56016116f3d018932b0bcc0eae69e" Feb 27 00:42:35 crc kubenswrapper[4781]: I0227 00:42:35.462442 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j5kdg"] Feb 27 00:42:35 crc kubenswrapper[4781]: I0227 00:42:35.469684 4781 scope.go:117] "RemoveContainer" containerID="40593e6c39e318c4bc42b772884d5b14e6880366dad393e45d23642abc403493" Feb 27 00:42:35 crc kubenswrapper[4781]: I0227 00:42:35.471997 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-j5kdg"] Feb 27 00:42:35 crc kubenswrapper[4781]: I0227 00:42:35.488408 4781 scope.go:117] "RemoveContainer" containerID="2fbf61290fa4aee7b5f7d7ed2e1a6d6a175da2775967ab4c314c14bc7cf150d5" Feb 27 00:42:37 crc kubenswrapper[4781]: I0227 00:42:37.320005 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bd52a8a-5bfd-45f3-8d26-90dcc81f2549" path="/var/lib/kubelet/pods/3bd52a8a-5bfd-45f3-8d26-90dcc81f2549/volumes" Feb 27 00:42:42 crc kubenswrapper[4781]: I0227 00:42:42.896074 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:42:42 crc kubenswrapper[4781]: I0227 00:42:42.896580 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:42:49 crc kubenswrapper[4781]: 
I0227 00:42:49.592724 4781 generic.go:334] "Generic (PLEG): container finished" podID="3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87" containerID="e8eaf2ec603b6edc8badc09564a9675ccb658970cd78310fb0d45ee49918516f" exitCode=0 Feb 27 00:42:49 crc kubenswrapper[4781]: I0227 00:42:49.592821 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" event={"ID":"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87","Type":"ContainerDied","Data":"e8eaf2ec603b6edc8badc09564a9675ccb658970cd78310fb0d45ee49918516f"} Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.153235 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.282590 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-neutron-metadata-combined-ca-bundle\") pod \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.282799 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-ssh-key-openstack-edpm-ipam\") pod \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.282866 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-nova-metadata-neutron-config-0\") pod \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.282895 4781 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-inventory\") pod \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.282922 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntw6w\" (UniqueName: \"kubernetes.io/projected/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-kube-api-access-ntw6w\") pod \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.282940 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-neutron-ovn-metadata-agent-neutron-config-0\") pod \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\" (UID: \"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87\") " Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.289065 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-kube-api-access-ntw6w" (OuterVolumeSpecName: "kube-api-access-ntw6w") pod "3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87" (UID: "3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87"). InnerVolumeSpecName "kube-api-access-ntw6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.290533 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87" (UID: "3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.319028 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87" (UID: "3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.319196 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87" (UID: "3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.327447 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87" (UID: "3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.327857 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-inventory" (OuterVolumeSpecName: "inventory") pod "3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87" (UID: "3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.386903 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.386938 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntw6w\" (UniqueName: \"kubernetes.io/projected/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-kube-api-access-ntw6w\") on node \"crc\" DevicePath \"\"" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.386971 4781 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.386984 4781 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.386994 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.387004 4781 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.622413 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" event={"ID":"3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87","Type":"ContainerDied","Data":"e006780cf4daccda68c6b25b213909e5856f96dd7fd97d562673bc7d27726bb2"} Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.622470 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e006780cf4daccda68c6b25b213909e5856f96dd7fd97d562673bc7d27726bb2" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.623726 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.711350 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c"] Feb 27 00:42:51 crc kubenswrapper[4781]: E0227 00:42:51.712003 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd52a8a-5bfd-45f3-8d26-90dcc81f2549" containerName="extract-utilities" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.712018 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd52a8a-5bfd-45f3-8d26-90dcc81f2549" containerName="extract-utilities" Feb 27 00:42:51 crc kubenswrapper[4781]: E0227 00:42:51.712038 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.712045 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 27 00:42:51 crc kubenswrapper[4781]: E0227 00:42:51.712053 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd52a8a-5bfd-45f3-8d26-90dcc81f2549" containerName="registry-server" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.712059 4781 
state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd52a8a-5bfd-45f3-8d26-90dcc81f2549" containerName="registry-server" Feb 27 00:42:51 crc kubenswrapper[4781]: E0227 00:42:51.712075 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd52a8a-5bfd-45f3-8d26-90dcc81f2549" containerName="extract-content" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.712081 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd52a8a-5bfd-45f3-8d26-90dcc81f2549" containerName="extract-content" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.712239 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bd52a8a-5bfd-45f3-8d26-90dcc81f2549" containerName="registry-server" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.712265 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.712973 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.715201 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mvxs7" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.715277 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.716998 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.717155 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.717181 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.719973 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c"] Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.795248 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.795361 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c\" (UID: 
\"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.795428 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.795497 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np7xl\" (UniqueName: \"kubernetes.io/projected/bd292468-b151-4004-b0b7-bd873e7e4e2d-kube-api-access-np7xl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.795545 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.896937 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.897026 
4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.897100 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np7xl\" (UniqueName: \"kubernetes.io/projected/bd292468-b151-4004-b0b7-bd873e7e4e2d-kube-api-access-np7xl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.897155 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.897185 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.901342 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-libvirt-combined-ca-bundle\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.901517 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.901828 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.902033 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:42:51 crc kubenswrapper[4781]: I0227 00:42:51.918779 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np7xl\" (UniqueName: \"kubernetes.io/projected/bd292468-b151-4004-b0b7-bd873e7e4e2d-kube-api-access-np7xl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:42:52 crc kubenswrapper[4781]: I0227 00:42:52.029974 4781 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:42:52 crc kubenswrapper[4781]: I0227 00:42:52.603429 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c"] Feb 27 00:42:52 crc kubenswrapper[4781]: I0227 00:42:52.632332 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" event={"ID":"bd292468-b151-4004-b0b7-bd873e7e4e2d","Type":"ContainerStarted","Data":"c119cf35418cf9a52f75fa4eac36439312f59759c419c8f80f423d37df05fd2f"} Feb 27 00:42:53 crc kubenswrapper[4781]: I0227 00:42:53.645782 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" event={"ID":"bd292468-b151-4004-b0b7-bd873e7e4e2d","Type":"ContainerStarted","Data":"82f87db0afeb37c294b7dd4a8934c5d99082b1d59480c43a23f358b6efcac0cb"} Feb 27 00:42:53 crc kubenswrapper[4781]: I0227 00:42:53.665746 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" podStartSLOduration=2.241658892 podStartE2EDuration="2.665725931s" podCreationTimestamp="2026-02-27 00:42:51 +0000 UTC" firstStartedPulling="2026-02-27 00:42:52.607098864 +0000 UTC m=+2241.864638418" lastFinishedPulling="2026-02-27 00:42:53.031165873 +0000 UTC m=+2242.288705457" observedRunningTime="2026-02-27 00:42:53.664841998 +0000 UTC m=+2242.922381562" watchObservedRunningTime="2026-02-27 00:42:53.665725931 +0000 UTC m=+2242.923265485" Feb 27 00:43:12 crc kubenswrapper[4781]: I0227 00:43:12.895380 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:43:12 crc kubenswrapper[4781]: 
I0227 00:43:12.896270 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:43:12 crc kubenswrapper[4781]: I0227 00:43:12.896319 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:43:12 crc kubenswrapper[4781]: I0227 00:43:12.897052 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4da315c4c7bf218d380bca00c0ade3ee72457fd61b27366edc67ffcf85618e37"} pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 00:43:12 crc kubenswrapper[4781]: I0227 00:43:12.897114 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" containerID="cri-o://4da315c4c7bf218d380bca00c0ade3ee72457fd61b27366edc67ffcf85618e37" gracePeriod=600 Feb 27 00:43:13 crc kubenswrapper[4781]: I0227 00:43:13.843756 4781 generic.go:334] "Generic (PLEG): container finished" podID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerID="4da315c4c7bf218d380bca00c0ade3ee72457fd61b27366edc67ffcf85618e37" exitCode=0 Feb 27 00:43:13 crc kubenswrapper[4781]: I0227 00:43:13.843828 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerDied","Data":"4da315c4c7bf218d380bca00c0ade3ee72457fd61b27366edc67ffcf85618e37"} Feb 27 00:43:13 crc 
kubenswrapper[4781]: I0227 00:43:13.844669 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084"} Feb 27 00:43:13 crc kubenswrapper[4781]: I0227 00:43:13.844702 4781 scope.go:117] "RemoveContainer" containerID="ee4b810ba478d861e826700cc46b741a655da3ef3281705fca3ad5d7a04f955f" Feb 27 00:44:00 crc kubenswrapper[4781]: I0227 00:44:00.159824 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535884-8t2lb"] Feb 27 00:44:00 crc kubenswrapper[4781]: I0227 00:44:00.161803 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535884-8t2lb" Feb 27 00:44:00 crc kubenswrapper[4781]: I0227 00:44:00.164977 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:44:00 crc kubenswrapper[4781]: I0227 00:44:00.165015 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:44:00 crc kubenswrapper[4781]: I0227 00:44:00.165136 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:44:00 crc kubenswrapper[4781]: I0227 00:44:00.177739 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535884-8t2lb"] Feb 27 00:44:00 crc kubenswrapper[4781]: I0227 00:44:00.233024 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xndw\" (UniqueName: \"kubernetes.io/projected/018f4ff5-f081-4257-8189-3eb14ea035f3-kube-api-access-8xndw\") pod \"auto-csr-approver-29535884-8t2lb\" (UID: \"018f4ff5-f081-4257-8189-3eb14ea035f3\") " 
pod="openshift-infra/auto-csr-approver-29535884-8t2lb" Feb 27 00:44:00 crc kubenswrapper[4781]: I0227 00:44:00.335263 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xndw\" (UniqueName: \"kubernetes.io/projected/018f4ff5-f081-4257-8189-3eb14ea035f3-kube-api-access-8xndw\") pod \"auto-csr-approver-29535884-8t2lb\" (UID: \"018f4ff5-f081-4257-8189-3eb14ea035f3\") " pod="openshift-infra/auto-csr-approver-29535884-8t2lb" Feb 27 00:44:00 crc kubenswrapper[4781]: I0227 00:44:00.353107 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xndw\" (UniqueName: \"kubernetes.io/projected/018f4ff5-f081-4257-8189-3eb14ea035f3-kube-api-access-8xndw\") pod \"auto-csr-approver-29535884-8t2lb\" (UID: \"018f4ff5-f081-4257-8189-3eb14ea035f3\") " pod="openshift-infra/auto-csr-approver-29535884-8t2lb" Feb 27 00:44:00 crc kubenswrapper[4781]: I0227 00:44:00.486110 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535884-8t2lb" Feb 27 00:44:00 crc kubenswrapper[4781]: I0227 00:44:00.968398 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535884-8t2lb"] Feb 27 00:44:01 crc kubenswrapper[4781]: I0227 00:44:01.307764 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535884-8t2lb" event={"ID":"018f4ff5-f081-4257-8189-3eb14ea035f3","Type":"ContainerStarted","Data":"f7470c5bf777d76181cd4c7a6803e2c6ed79b2d14788346a445f0ea22ee049cd"} Feb 27 00:44:02 crc kubenswrapper[4781]: I0227 00:44:02.318704 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535884-8t2lb" event={"ID":"018f4ff5-f081-4257-8189-3eb14ea035f3","Type":"ContainerStarted","Data":"26e013582f5ee2e314ebc2f4329b87db88bd3251fee9e3e932b5b02ee387f73b"} Feb 27 00:44:02 crc kubenswrapper[4781]: I0227 00:44:02.342770 4781 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535884-8t2lb" podStartSLOduration=1.394038211 podStartE2EDuration="2.342751616s" podCreationTimestamp="2026-02-27 00:44:00 +0000 UTC" firstStartedPulling="2026-02-27 00:44:00.969554041 +0000 UTC m=+2310.227093595" lastFinishedPulling="2026-02-27 00:44:01.918267446 +0000 UTC m=+2311.175807000" observedRunningTime="2026-02-27 00:44:02.335250429 +0000 UTC m=+2311.592789983" watchObservedRunningTime="2026-02-27 00:44:02.342751616 +0000 UTC m=+2311.600291170" Feb 27 00:44:03 crc kubenswrapper[4781]: I0227 00:44:03.356345 4781 generic.go:334] "Generic (PLEG): container finished" podID="018f4ff5-f081-4257-8189-3eb14ea035f3" containerID="26e013582f5ee2e314ebc2f4329b87db88bd3251fee9e3e932b5b02ee387f73b" exitCode=0 Feb 27 00:44:03 crc kubenswrapper[4781]: I0227 00:44:03.357735 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535884-8t2lb" event={"ID":"018f4ff5-f081-4257-8189-3eb14ea035f3","Type":"ContainerDied","Data":"26e013582f5ee2e314ebc2f4329b87db88bd3251fee9e3e932b5b02ee387f73b"} Feb 27 00:44:04 crc kubenswrapper[4781]: I0227 00:44:04.833692 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535884-8t2lb" Feb 27 00:44:04 crc kubenswrapper[4781]: I0227 00:44:04.933216 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xndw\" (UniqueName: \"kubernetes.io/projected/018f4ff5-f081-4257-8189-3eb14ea035f3-kube-api-access-8xndw\") pod \"018f4ff5-f081-4257-8189-3eb14ea035f3\" (UID: \"018f4ff5-f081-4257-8189-3eb14ea035f3\") " Feb 27 00:44:04 crc kubenswrapper[4781]: I0227 00:44:04.939757 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/018f4ff5-f081-4257-8189-3eb14ea035f3-kube-api-access-8xndw" (OuterVolumeSpecName: "kube-api-access-8xndw") pod "018f4ff5-f081-4257-8189-3eb14ea035f3" (UID: "018f4ff5-f081-4257-8189-3eb14ea035f3"). InnerVolumeSpecName "kube-api-access-8xndw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:44:05 crc kubenswrapper[4781]: I0227 00:44:05.035480 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xndw\" (UniqueName: \"kubernetes.io/projected/018f4ff5-f081-4257-8189-3eb14ea035f3-kube-api-access-8xndw\") on node \"crc\" DevicePath \"\"" Feb 27 00:44:05 crc kubenswrapper[4781]: I0227 00:44:05.376439 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535884-8t2lb" event={"ID":"018f4ff5-f081-4257-8189-3eb14ea035f3","Type":"ContainerDied","Data":"f7470c5bf777d76181cd4c7a6803e2c6ed79b2d14788346a445f0ea22ee049cd"} Feb 27 00:44:05 crc kubenswrapper[4781]: I0227 00:44:05.376477 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7470c5bf777d76181cd4c7a6803e2c6ed79b2d14788346a445f0ea22ee049cd" Feb 27 00:44:05 crc kubenswrapper[4781]: I0227 00:44:05.376519 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535884-8t2lb" Feb 27 00:44:05 crc kubenswrapper[4781]: I0227 00:44:05.422367 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535878-49z87"] Feb 27 00:44:05 crc kubenswrapper[4781]: I0227 00:44:05.431648 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535878-49z87"] Feb 27 00:44:07 crc kubenswrapper[4781]: I0227 00:44:07.324954 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b63206fe-04b3-4f07-a4cb-f8fd89645931" path="/var/lib/kubelet/pods/b63206fe-04b3-4f07-a4cb-f8fd89645931/volumes" Feb 27 00:44:31 crc kubenswrapper[4781]: I0227 00:44:31.177980 4781 scope.go:117] "RemoveContainer" containerID="e098a22e98e83ab04db629aad7e6384885fe2b771dad33544e78c6562872ae4e" Feb 27 00:45:00 crc kubenswrapper[4781]: I0227 00:45:00.149858 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2"] Feb 27 00:45:00 crc kubenswrapper[4781]: E0227 00:45:00.150785 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="018f4ff5-f081-4257-8189-3eb14ea035f3" containerName="oc" Feb 27 00:45:00 crc kubenswrapper[4781]: I0227 00:45:00.150798 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="018f4ff5-f081-4257-8189-3eb14ea035f3" containerName="oc" Feb 27 00:45:00 crc kubenswrapper[4781]: I0227 00:45:00.151000 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="018f4ff5-f081-4257-8189-3eb14ea035f3" containerName="oc" Feb 27 00:45:00 crc kubenswrapper[4781]: I0227 00:45:00.151842 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2" Feb 27 00:45:00 crc kubenswrapper[4781]: I0227 00:45:00.154290 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 27 00:45:00 crc kubenswrapper[4781]: I0227 00:45:00.154381 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 27 00:45:00 crc kubenswrapper[4781]: I0227 00:45:00.163250 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2"] Feb 27 00:45:00 crc kubenswrapper[4781]: I0227 00:45:00.179867 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db-secret-volume\") pod \"collect-profiles-29535885-nfxm2\" (UID: \"a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2" Feb 27 00:45:00 crc kubenswrapper[4781]: I0227 00:45:00.179983 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4l5d\" (UniqueName: \"kubernetes.io/projected/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db-kube-api-access-l4l5d\") pod \"collect-profiles-29535885-nfxm2\" (UID: \"a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2" Feb 27 00:45:00 crc kubenswrapper[4781]: I0227 00:45:00.180085 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db-config-volume\") pod \"collect-profiles-29535885-nfxm2\" (UID: \"a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2" Feb 27 00:45:00 crc kubenswrapper[4781]: I0227 00:45:00.281512 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4l5d\" (UniqueName: \"kubernetes.io/projected/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db-kube-api-access-l4l5d\") pod \"collect-profiles-29535885-nfxm2\" (UID: \"a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2" Feb 27 00:45:00 crc kubenswrapper[4781]: I0227 00:45:00.281672 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db-config-volume\") pod \"collect-profiles-29535885-nfxm2\" (UID: \"a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2" Feb 27 00:45:00 crc kubenswrapper[4781]: I0227 00:45:00.281795 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db-secret-volume\") pod \"collect-profiles-29535885-nfxm2\" (UID: \"a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2" Feb 27 00:45:00 crc kubenswrapper[4781]: I0227 00:45:00.282581 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db-config-volume\") pod \"collect-profiles-29535885-nfxm2\" (UID: \"a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2" Feb 27 00:45:00 crc kubenswrapper[4781]: I0227 00:45:00.287014 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db-secret-volume\") pod \"collect-profiles-29535885-nfxm2\" (UID: \"a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2" Feb 27 00:45:00 crc kubenswrapper[4781]: I0227 00:45:00.298580 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4l5d\" (UniqueName: \"kubernetes.io/projected/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db-kube-api-access-l4l5d\") pod \"collect-profiles-29535885-nfxm2\" (UID: \"a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2" Feb 27 00:45:00 crc kubenswrapper[4781]: I0227 00:45:00.475778 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2" Feb 27 00:45:00 crc kubenswrapper[4781]: I0227 00:45:00.925986 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2"] Feb 27 00:45:01 crc kubenswrapper[4781]: I0227 00:45:01.884545 4781 generic.go:334] "Generic (PLEG): container finished" podID="a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db" containerID="13bcf8d94b2a16937b07dfe8f4ce503b88a240b7d9c23876edfc03e06b4dceeb" exitCode=0 Feb 27 00:45:01 crc kubenswrapper[4781]: I0227 00:45:01.884762 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2" event={"ID":"a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db","Type":"ContainerDied","Data":"13bcf8d94b2a16937b07dfe8f4ce503b88a240b7d9c23876edfc03e06b4dceeb"} Feb 27 00:45:01 crc kubenswrapper[4781]: I0227 00:45:01.885262 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2" 
event={"ID":"a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db","Type":"ContainerStarted","Data":"d25cc4cd235d26dd009c8b84749f6ad9cdbca5cf44724fba98ad630d5bb5c967"} Feb 27 00:45:03 crc kubenswrapper[4781]: I0227 00:45:03.319842 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2" Feb 27 00:45:03 crc kubenswrapper[4781]: I0227 00:45:03.443444 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db-secret-volume\") pod \"a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db\" (UID: \"a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db\") " Feb 27 00:45:03 crc kubenswrapper[4781]: I0227 00:45:03.443656 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4l5d\" (UniqueName: \"kubernetes.io/projected/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db-kube-api-access-l4l5d\") pod \"a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db\" (UID: \"a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db\") " Feb 27 00:45:03 crc kubenswrapper[4781]: I0227 00:45:03.443717 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db-config-volume\") pod \"a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db\" (UID: \"a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db\") " Feb 27 00:45:03 crc kubenswrapper[4781]: I0227 00:45:03.444458 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db-config-volume" (OuterVolumeSpecName: "config-volume") pod "a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db" (UID: "a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:45:03 crc kubenswrapper[4781]: I0227 00:45:03.449388 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db-kube-api-access-l4l5d" (OuterVolumeSpecName: "kube-api-access-l4l5d") pod "a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db" (UID: "a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db"). InnerVolumeSpecName "kube-api-access-l4l5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:45:03 crc kubenswrapper[4781]: I0227 00:45:03.449800 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db" (UID: "a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:45:03 crc kubenswrapper[4781]: I0227 00:45:03.546289 4781 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 27 00:45:03 crc kubenswrapper[4781]: I0227 00:45:03.546327 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4l5d\" (UniqueName: \"kubernetes.io/projected/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db-kube-api-access-l4l5d\") on node \"crc\" DevicePath \"\"" Feb 27 00:45:03 crc kubenswrapper[4781]: I0227 00:45:03.546338 4781 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db-config-volume\") on node \"crc\" DevicePath \"\"" Feb 27 00:45:03 crc kubenswrapper[4781]: I0227 00:45:03.916702 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2" 
event={"ID":"a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db","Type":"ContainerDied","Data":"d25cc4cd235d26dd009c8b84749f6ad9cdbca5cf44724fba98ad630d5bb5c967"} Feb 27 00:45:03 crc kubenswrapper[4781]: I0227 00:45:03.916741 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d25cc4cd235d26dd009c8b84749f6ad9cdbca5cf44724fba98ad630d5bb5c967" Feb 27 00:45:03 crc kubenswrapper[4781]: I0227 00:45:03.916790 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2" Feb 27 00:45:04 crc kubenswrapper[4781]: I0227 00:45:04.391045 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm"] Feb 27 00:45:04 crc kubenswrapper[4781]: I0227 00:45:04.400224 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535840-tfxxm"] Feb 27 00:45:05 crc kubenswrapper[4781]: I0227 00:45:05.324134 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="678f27fc-d210-4a4f-bd73-090378740da9" path="/var/lib/kubelet/pods/678f27fc-d210-4a4f-bd73-090378740da9/volumes" Feb 27 00:45:31 crc kubenswrapper[4781]: I0227 00:45:31.254825 4781 scope.go:117] "RemoveContainer" containerID="898ccef1da25e7c00fcd11040419fe4b505ada16cb26d62d9a4806872cb68348" Feb 27 00:45:42 crc kubenswrapper[4781]: I0227 00:45:42.894994 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:45:42 crc kubenswrapper[4781]: I0227 00:45:42.895802 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:46:00 crc kubenswrapper[4781]: I0227 00:46:00.172007 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535886-5khtq"] Feb 27 00:46:00 crc kubenswrapper[4781]: E0227 00:46:00.175392 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db" containerName="collect-profiles" Feb 27 00:46:00 crc kubenswrapper[4781]: I0227 00:46:00.175445 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db" containerName="collect-profiles" Feb 27 00:46:00 crc kubenswrapper[4781]: I0227 00:46:00.176052 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db" containerName="collect-profiles" Feb 27 00:46:00 crc kubenswrapper[4781]: I0227 00:46:00.177237 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535886-5khtq" Feb 27 00:46:00 crc kubenswrapper[4781]: I0227 00:46:00.182941 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:46:00 crc kubenswrapper[4781]: I0227 00:46:00.183733 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:46:00 crc kubenswrapper[4781]: I0227 00:46:00.184437 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:46:00 crc kubenswrapper[4781]: I0227 00:46:00.204223 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535886-5khtq"] Feb 27 00:46:00 crc kubenswrapper[4781]: I0227 00:46:00.312701 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m52c4\" (UniqueName: \"kubernetes.io/projected/8143ddb0-990c-4f1e-9130-7ca30776e64b-kube-api-access-m52c4\") pod \"auto-csr-approver-29535886-5khtq\" (UID: \"8143ddb0-990c-4f1e-9130-7ca30776e64b\") " pod="openshift-infra/auto-csr-approver-29535886-5khtq" Feb 27 00:46:00 crc kubenswrapper[4781]: I0227 00:46:00.414360 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m52c4\" (UniqueName: \"kubernetes.io/projected/8143ddb0-990c-4f1e-9130-7ca30776e64b-kube-api-access-m52c4\") pod \"auto-csr-approver-29535886-5khtq\" (UID: \"8143ddb0-990c-4f1e-9130-7ca30776e64b\") " pod="openshift-infra/auto-csr-approver-29535886-5khtq" Feb 27 00:46:00 crc kubenswrapper[4781]: I0227 00:46:00.433012 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m52c4\" (UniqueName: \"kubernetes.io/projected/8143ddb0-990c-4f1e-9130-7ca30776e64b-kube-api-access-m52c4\") pod \"auto-csr-approver-29535886-5khtq\" (UID: \"8143ddb0-990c-4f1e-9130-7ca30776e64b\") " 
pod="openshift-infra/auto-csr-approver-29535886-5khtq" Feb 27 00:46:00 crc kubenswrapper[4781]: I0227 00:46:00.522213 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535886-5khtq" Feb 27 00:46:00 crc kubenswrapper[4781]: I0227 00:46:00.978061 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535886-5khtq"] Feb 27 00:46:01 crc kubenswrapper[4781]: I0227 00:46:01.481557 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535886-5khtq" event={"ID":"8143ddb0-990c-4f1e-9130-7ca30776e64b","Type":"ContainerStarted","Data":"088040718b7f1e23e576481b5b67af8b3f210dfca038e331cb9c81b5567e1956"} Feb 27 00:46:02 crc kubenswrapper[4781]: I0227 00:46:02.492452 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535886-5khtq" event={"ID":"8143ddb0-990c-4f1e-9130-7ca30776e64b","Type":"ContainerStarted","Data":"e0bb531ca8e9ee4c1a35ccb62422bfe50af2c334314f4bd145d5137b8ad741e6"} Feb 27 00:46:02 crc kubenswrapper[4781]: I0227 00:46:02.517348 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535886-5khtq" podStartSLOduration=1.455489765 podStartE2EDuration="2.517331706s" podCreationTimestamp="2026-02-27 00:46:00 +0000 UTC" firstStartedPulling="2026-02-27 00:46:00.989729086 +0000 UTC m=+2430.247268640" lastFinishedPulling="2026-02-27 00:46:02.051571027 +0000 UTC m=+2431.309110581" observedRunningTime="2026-02-27 00:46:02.511103672 +0000 UTC m=+2431.768643226" watchObservedRunningTime="2026-02-27 00:46:02.517331706 +0000 UTC m=+2431.774871260" Feb 27 00:46:03 crc kubenswrapper[4781]: I0227 00:46:03.501695 4781 generic.go:334] "Generic (PLEG): container finished" podID="8143ddb0-990c-4f1e-9130-7ca30776e64b" containerID="e0bb531ca8e9ee4c1a35ccb62422bfe50af2c334314f4bd145d5137b8ad741e6" exitCode=0 Feb 27 00:46:03 crc 
kubenswrapper[4781]: I0227 00:46:03.501768 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535886-5khtq" event={"ID":"8143ddb0-990c-4f1e-9130-7ca30776e64b","Type":"ContainerDied","Data":"e0bb531ca8e9ee4c1a35ccb62422bfe50af2c334314f4bd145d5137b8ad741e6"} Feb 27 00:46:04 crc kubenswrapper[4781]: I0227 00:46:04.904082 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535886-5khtq" Feb 27 00:46:05 crc kubenswrapper[4781]: I0227 00:46:05.012268 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m52c4\" (UniqueName: \"kubernetes.io/projected/8143ddb0-990c-4f1e-9130-7ca30776e64b-kube-api-access-m52c4\") pod \"8143ddb0-990c-4f1e-9130-7ca30776e64b\" (UID: \"8143ddb0-990c-4f1e-9130-7ca30776e64b\") " Feb 27 00:46:05 crc kubenswrapper[4781]: I0227 00:46:05.018990 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8143ddb0-990c-4f1e-9130-7ca30776e64b-kube-api-access-m52c4" (OuterVolumeSpecName: "kube-api-access-m52c4") pod "8143ddb0-990c-4f1e-9130-7ca30776e64b" (UID: "8143ddb0-990c-4f1e-9130-7ca30776e64b"). InnerVolumeSpecName "kube-api-access-m52c4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:46:05 crc kubenswrapper[4781]: I0227 00:46:05.115587 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m52c4\" (UniqueName: \"kubernetes.io/projected/8143ddb0-990c-4f1e-9130-7ca30776e64b-kube-api-access-m52c4\") on node \"crc\" DevicePath \"\"" Feb 27 00:46:05 crc kubenswrapper[4781]: I0227 00:46:05.528007 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535886-5khtq" event={"ID":"8143ddb0-990c-4f1e-9130-7ca30776e64b","Type":"ContainerDied","Data":"088040718b7f1e23e576481b5b67af8b3f210dfca038e331cb9c81b5567e1956"} Feb 27 00:46:05 crc kubenswrapper[4781]: I0227 00:46:05.528069 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="088040718b7f1e23e576481b5b67af8b3f210dfca038e331cb9c81b5567e1956" Feb 27 00:46:05 crc kubenswrapper[4781]: I0227 00:46:05.528079 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535886-5khtq" Feb 27 00:46:05 crc kubenswrapper[4781]: I0227 00:46:05.588372 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535880-9cpwk"] Feb 27 00:46:05 crc kubenswrapper[4781]: I0227 00:46:05.597487 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535880-9cpwk"] Feb 27 00:46:07 crc kubenswrapper[4781]: I0227 00:46:07.321913 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93fc175b-7238-41ec-91f7-17cc07188100" path="/var/lib/kubelet/pods/93fc175b-7238-41ec-91f7-17cc07188100/volumes" Feb 27 00:46:12 crc kubenswrapper[4781]: I0227 00:46:12.895323 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 27 00:46:12 crc kubenswrapper[4781]: I0227 00:46:12.896882 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:46:30 crc kubenswrapper[4781]: I0227 00:46:30.765173 4781 generic.go:334] "Generic (PLEG): container finished" podID="bd292468-b151-4004-b0b7-bd873e7e4e2d" containerID="82f87db0afeb37c294b7dd4a8934c5d99082b1d59480c43a23f358b6efcac0cb" exitCode=0 Feb 27 00:46:30 crc kubenswrapper[4781]: I0227 00:46:30.765266 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" event={"ID":"bd292468-b151-4004-b0b7-bd873e7e4e2d","Type":"ContainerDied","Data":"82f87db0afeb37c294b7dd4a8934c5d99082b1d59480c43a23f358b6efcac0cb"} Feb 27 00:46:31 crc kubenswrapper[4781]: I0227 00:46:31.354956 4781 scope.go:117] "RemoveContainer" containerID="f95b25c7f6b69f37212289ff6ccaf1c8b693e043eb0635c23ef340ef5632fb12" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.322375 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.395787 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-libvirt-secret-0\") pod \"bd292468-b151-4004-b0b7-bd873e7e4e2d\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.396328 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-inventory\") pod \"bd292468-b151-4004-b0b7-bd873e7e4e2d\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.396430 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np7xl\" (UniqueName: \"kubernetes.io/projected/bd292468-b151-4004-b0b7-bd873e7e4e2d-kube-api-access-np7xl\") pod \"bd292468-b151-4004-b0b7-bd873e7e4e2d\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.396464 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-ssh-key-openstack-edpm-ipam\") pod \"bd292468-b151-4004-b0b7-bd873e7e4e2d\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.396501 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-libvirt-combined-ca-bundle\") pod \"bd292468-b151-4004-b0b7-bd873e7e4e2d\" (UID: \"bd292468-b151-4004-b0b7-bd873e7e4e2d\") " Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.403480 4781 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "bd292468-b151-4004-b0b7-bd873e7e4e2d" (UID: "bd292468-b151-4004-b0b7-bd873e7e4e2d"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.419047 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd292468-b151-4004-b0b7-bd873e7e4e2d-kube-api-access-np7xl" (OuterVolumeSpecName: "kube-api-access-np7xl") pod "bd292468-b151-4004-b0b7-bd873e7e4e2d" (UID: "bd292468-b151-4004-b0b7-bd873e7e4e2d"). InnerVolumeSpecName "kube-api-access-np7xl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.427140 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bd292468-b151-4004-b0b7-bd873e7e4e2d" (UID: "bd292468-b151-4004-b0b7-bd873e7e4e2d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.432123 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "bd292468-b151-4004-b0b7-bd873e7e4e2d" (UID: "bd292468-b151-4004-b0b7-bd873e7e4e2d"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.437778 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-inventory" (OuterVolumeSpecName: "inventory") pod "bd292468-b151-4004-b0b7-bd873e7e4e2d" (UID: "bd292468-b151-4004-b0b7-bd873e7e4e2d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.499361 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.499400 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np7xl\" (UniqueName: \"kubernetes.io/projected/bd292468-b151-4004-b0b7-bd873e7e4e2d-kube-api-access-np7xl\") on node \"crc\" DevicePath \"\"" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.499413 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.499423 4781 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.499434 4781 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/bd292468-b151-4004-b0b7-bd873e7e4e2d-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.787598 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" event={"ID":"bd292468-b151-4004-b0b7-bd873e7e4e2d","Type":"ContainerDied","Data":"c119cf35418cf9a52f75fa4eac36439312f59759c419c8f80f423d37df05fd2f"} Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.787673 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c119cf35418cf9a52f75fa4eac36439312f59759c419c8f80f423d37df05fd2f" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.787669 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.880380 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h"] Feb 27 00:46:32 crc kubenswrapper[4781]: E0227 00:46:32.880818 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8143ddb0-990c-4f1e-9130-7ca30776e64b" containerName="oc" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.880835 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8143ddb0-990c-4f1e-9130-7ca30776e64b" containerName="oc" Feb 27 00:46:32 crc kubenswrapper[4781]: E0227 00:46:32.880849 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd292468-b151-4004-b0b7-bd873e7e4e2d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.880856 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd292468-b151-4004-b0b7-bd873e7e4e2d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.881070 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="8143ddb0-990c-4f1e-9130-7ca30776e64b" containerName="oc" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.881090 4781 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="bd292468-b151-4004-b0b7-bd873e7e4e2d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.881795 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.885443 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.885722 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.886914 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.887257 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.887456 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.887732 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mvxs7" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.889363 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.892710 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h"] Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.909895 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: 
\"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.909971 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.910023 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.910266 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.910362 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: 
\"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.910460 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7qk5\" (UniqueName: \"kubernetes.io/projected/d3f8abc3-17b4-4d88-890e-85304a100a97-kube-api-access-n7qk5\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.910494 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.910524 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.910547 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 
00:46:32.910728 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:32 crc kubenswrapper[4781]: I0227 00:46:32.910795 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.013456 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.013547 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.013599 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7qk5\" (UniqueName: \"kubernetes.io/projected/d3f8abc3-17b4-4d88-890e-85304a100a97-kube-api-access-n7qk5\") 
pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.013622 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.013661 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.013677 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.013704 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.013726 
4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.013762 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.013787 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.013801 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.016978 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.018123 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.018532 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.018547 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.018709 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc 
kubenswrapper[4781]: I0227 00:46:33.018841 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.019850 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.021295 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.021471 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.022713 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-0\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.042390 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7qk5\" (UniqueName: \"kubernetes.io/projected/d3f8abc3-17b4-4d88-890e-85304a100a97-kube-api-access-n7qk5\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ntt4h\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:33 crc kubenswrapper[4781]: I0227 00:46:33.214184 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:46:34 crc kubenswrapper[4781]: I0227 00:46:34.597106 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h"] Feb 27 00:46:34 crc kubenswrapper[4781]: I0227 00:46:34.807181 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" event={"ID":"d3f8abc3-17b4-4d88-890e-85304a100a97","Type":"ContainerStarted","Data":"0f40d07d67261eda4bba8df0dd754f507383635699da4a2039a40542ef874ffe"} Feb 27 00:46:35 crc kubenswrapper[4781]: I0227 00:46:35.818972 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" event={"ID":"d3f8abc3-17b4-4d88-890e-85304a100a97","Type":"ContainerStarted","Data":"d3e6c31e59c8273a4822b6ba92413f35b26f4d3e1b11014494798bec77bd763c"} Feb 27 00:46:35 crc kubenswrapper[4781]: I0227 00:46:35.843257 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" podStartSLOduration=3.425861555 podStartE2EDuration="3.8432386s" podCreationTimestamp="2026-02-27 00:46:32 +0000 UTC" firstStartedPulling="2026-02-27 
00:46:34.59959932 +0000 UTC m=+2463.857138874" lastFinishedPulling="2026-02-27 00:46:35.016976365 +0000 UTC m=+2464.274515919" observedRunningTime="2026-02-27 00:46:35.835396904 +0000 UTC m=+2465.092936478" watchObservedRunningTime="2026-02-27 00:46:35.8432386 +0000 UTC m=+2465.100778154" Feb 27 00:46:42 crc kubenswrapper[4781]: I0227 00:46:42.895933 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:46:42 crc kubenswrapper[4781]: I0227 00:46:42.896564 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:46:42 crc kubenswrapper[4781]: I0227 00:46:42.896605 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:46:42 crc kubenswrapper[4781]: I0227 00:46:42.897353 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084"} pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 00:46:42 crc kubenswrapper[4781]: I0227 00:46:42.897396 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" 
containerID="cri-o://b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" gracePeriod=600 Feb 27 00:46:43 crc kubenswrapper[4781]: E0227 00:46:43.027793 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:46:43 crc kubenswrapper[4781]: I0227 00:46:43.893613 4781 generic.go:334] "Generic (PLEG): container finished" podID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" exitCode=0 Feb 27 00:46:43 crc kubenswrapper[4781]: I0227 00:46:43.893764 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerDied","Data":"b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084"} Feb 27 00:46:43 crc kubenswrapper[4781]: I0227 00:46:43.894070 4781 scope.go:117] "RemoveContainer" containerID="4da315c4c7bf218d380bca00c0ade3ee72457fd61b27366edc67ffcf85618e37" Feb 27 00:46:43 crc kubenswrapper[4781]: I0227 00:46:43.895770 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" Feb 27 00:46:43 crc kubenswrapper[4781]: E0227 00:46:43.896269 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" 
podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:46:55 crc kubenswrapper[4781]: I0227 00:46:55.310785 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" Feb 27 00:46:55 crc kubenswrapper[4781]: E0227 00:46:55.312096 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:47:10 crc kubenswrapper[4781]: I0227 00:47:10.309352 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" Feb 27 00:47:10 crc kubenswrapper[4781]: E0227 00:47:10.310277 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:47:22 crc kubenswrapper[4781]: I0227 00:47:22.309821 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" Feb 27 00:47:22 crc kubenswrapper[4781]: E0227 00:47:22.310512 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:47:34 crc kubenswrapper[4781]: I0227 00:47:34.309291 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" Feb 27 00:47:34 crc kubenswrapper[4781]: E0227 00:47:34.311021 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:47:46 crc kubenswrapper[4781]: I0227 00:47:46.310030 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" Feb 27 00:47:46 crc kubenswrapper[4781]: E0227 00:47:46.312568 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:48:00 crc kubenswrapper[4781]: I0227 00:48:00.149946 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535888-nb28f"] Feb 27 00:48:00 crc kubenswrapper[4781]: I0227 00:48:00.151817 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535888-nb28f" Feb 27 00:48:00 crc kubenswrapper[4781]: I0227 00:48:00.155308 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:48:00 crc kubenswrapper[4781]: I0227 00:48:00.155376 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:48:00 crc kubenswrapper[4781]: I0227 00:48:00.155456 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:48:00 crc kubenswrapper[4781]: I0227 00:48:00.172924 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535888-nb28f"] Feb 27 00:48:00 crc kubenswrapper[4781]: I0227 00:48:00.194848 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lh24\" (UniqueName: \"kubernetes.io/projected/231f1edd-305c-4a6c-bd4e-11c12c2ae515-kube-api-access-9lh24\") pod \"auto-csr-approver-29535888-nb28f\" (UID: \"231f1edd-305c-4a6c-bd4e-11c12c2ae515\") " pod="openshift-infra/auto-csr-approver-29535888-nb28f" Feb 27 00:48:00 crc kubenswrapper[4781]: I0227 00:48:00.299272 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lh24\" (UniqueName: \"kubernetes.io/projected/231f1edd-305c-4a6c-bd4e-11c12c2ae515-kube-api-access-9lh24\") pod \"auto-csr-approver-29535888-nb28f\" (UID: \"231f1edd-305c-4a6c-bd4e-11c12c2ae515\") " pod="openshift-infra/auto-csr-approver-29535888-nb28f" Feb 27 00:48:00 crc kubenswrapper[4781]: I0227 00:48:00.313812 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" Feb 27 00:48:00 crc kubenswrapper[4781]: E0227 00:48:00.314061 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:48:00 crc kubenswrapper[4781]: I0227 00:48:00.327545 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lh24\" (UniqueName: \"kubernetes.io/projected/231f1edd-305c-4a6c-bd4e-11c12c2ae515-kube-api-access-9lh24\") pod \"auto-csr-approver-29535888-nb28f\" (UID: \"231f1edd-305c-4a6c-bd4e-11c12c2ae515\") " pod="openshift-infra/auto-csr-approver-29535888-nb28f" Feb 27 00:48:00 crc kubenswrapper[4781]: I0227 00:48:00.473990 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535888-nb28f" Feb 27 00:48:00 crc kubenswrapper[4781]: I0227 00:48:00.958679 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535888-nb28f"] Feb 27 00:48:00 crc kubenswrapper[4781]: I0227 00:48:00.962238 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 00:48:01 crc kubenswrapper[4781]: I0227 00:48:01.621209 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535888-nb28f" event={"ID":"231f1edd-305c-4a6c-bd4e-11c12c2ae515","Type":"ContainerStarted","Data":"a0c41d90becd6a07f952648077ccd76199df70e8b044cb971027324554b510b6"} Feb 27 00:48:02 crc kubenswrapper[4781]: I0227 00:48:02.632492 4781 generic.go:334] "Generic (PLEG): container finished" podID="231f1edd-305c-4a6c-bd4e-11c12c2ae515" containerID="2518570ffdceb97ceb198f4ca24bb08d3d0c202488b87c6e1650891fc7084042" exitCode=0 Feb 27 00:48:02 crc kubenswrapper[4781]: I0227 00:48:02.632558 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29535888-nb28f" event={"ID":"231f1edd-305c-4a6c-bd4e-11c12c2ae515","Type":"ContainerDied","Data":"2518570ffdceb97ceb198f4ca24bb08d3d0c202488b87c6e1650891fc7084042"} Feb 27 00:48:04 crc kubenswrapper[4781]: I0227 00:48:04.104359 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535888-nb28f" Feb 27 00:48:04 crc kubenswrapper[4781]: I0227 00:48:04.191128 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lh24\" (UniqueName: \"kubernetes.io/projected/231f1edd-305c-4a6c-bd4e-11c12c2ae515-kube-api-access-9lh24\") pod \"231f1edd-305c-4a6c-bd4e-11c12c2ae515\" (UID: \"231f1edd-305c-4a6c-bd4e-11c12c2ae515\") " Feb 27 00:48:04 crc kubenswrapper[4781]: I0227 00:48:04.209565 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/231f1edd-305c-4a6c-bd4e-11c12c2ae515-kube-api-access-9lh24" (OuterVolumeSpecName: "kube-api-access-9lh24") pod "231f1edd-305c-4a6c-bd4e-11c12c2ae515" (UID: "231f1edd-305c-4a6c-bd4e-11c12c2ae515"). InnerVolumeSpecName "kube-api-access-9lh24". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:48:04 crc kubenswrapper[4781]: I0227 00:48:04.293485 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lh24\" (UniqueName: \"kubernetes.io/projected/231f1edd-305c-4a6c-bd4e-11c12c2ae515-kube-api-access-9lh24\") on node \"crc\" DevicePath \"\"" Feb 27 00:48:04 crc kubenswrapper[4781]: I0227 00:48:04.653897 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535888-nb28f" event={"ID":"231f1edd-305c-4a6c-bd4e-11c12c2ae515","Type":"ContainerDied","Data":"a0c41d90becd6a07f952648077ccd76199df70e8b044cb971027324554b510b6"} Feb 27 00:48:04 crc kubenswrapper[4781]: I0227 00:48:04.653952 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0c41d90becd6a07f952648077ccd76199df70e8b044cb971027324554b510b6" Feb 27 00:48:04 crc kubenswrapper[4781]: I0227 00:48:04.654013 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535888-nb28f" Feb 27 00:48:05 crc kubenswrapper[4781]: I0227 00:48:05.187870 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535882-skl65"] Feb 27 00:48:05 crc kubenswrapper[4781]: I0227 00:48:05.201878 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535882-skl65"] Feb 27 00:48:05 crc kubenswrapper[4781]: I0227 00:48:05.321287 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29db339c-88ad-410b-bad1-e5f5328e9a0a" path="/var/lib/kubelet/pods/29db339c-88ad-410b-bad1-e5f5328e9a0a/volumes" Feb 27 00:48:05 crc kubenswrapper[4781]: I0227 00:48:05.887512 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vpfn4"] Feb 27 00:48:05 crc kubenswrapper[4781]: E0227 00:48:05.888085 4781 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="231f1edd-305c-4a6c-bd4e-11c12c2ae515" containerName="oc" Feb 27 00:48:05 crc kubenswrapper[4781]: I0227 00:48:05.888108 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="231f1edd-305c-4a6c-bd4e-11c12c2ae515" containerName="oc" Feb 27 00:48:05 crc kubenswrapper[4781]: I0227 00:48:05.888346 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="231f1edd-305c-4a6c-bd4e-11c12c2ae515" containerName="oc" Feb 27 00:48:05 crc kubenswrapper[4781]: I0227 00:48:05.890521 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vpfn4" Feb 27 00:48:05 crc kubenswrapper[4781]: I0227 00:48:05.899876 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vpfn4"] Feb 27 00:48:06 crc kubenswrapper[4781]: I0227 00:48:06.026648 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/577f63dd-8b20-434a-ae9b-3d9589f08ccf-utilities\") pod \"redhat-marketplace-vpfn4\" (UID: \"577f63dd-8b20-434a-ae9b-3d9589f08ccf\") " pod="openshift-marketplace/redhat-marketplace-vpfn4" Feb 27 00:48:06 crc kubenswrapper[4781]: I0227 00:48:06.026810 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f97rp\" (UniqueName: \"kubernetes.io/projected/577f63dd-8b20-434a-ae9b-3d9589f08ccf-kube-api-access-f97rp\") pod \"redhat-marketplace-vpfn4\" (UID: \"577f63dd-8b20-434a-ae9b-3d9589f08ccf\") " pod="openshift-marketplace/redhat-marketplace-vpfn4" Feb 27 00:48:06 crc kubenswrapper[4781]: I0227 00:48:06.026939 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/577f63dd-8b20-434a-ae9b-3d9589f08ccf-catalog-content\") pod \"redhat-marketplace-vpfn4\" (UID: \"577f63dd-8b20-434a-ae9b-3d9589f08ccf\") " 
pod="openshift-marketplace/redhat-marketplace-vpfn4" Feb 27 00:48:06 crc kubenswrapper[4781]: I0227 00:48:06.129042 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/577f63dd-8b20-434a-ae9b-3d9589f08ccf-utilities\") pod \"redhat-marketplace-vpfn4\" (UID: \"577f63dd-8b20-434a-ae9b-3d9589f08ccf\") " pod="openshift-marketplace/redhat-marketplace-vpfn4" Feb 27 00:48:06 crc kubenswrapper[4781]: I0227 00:48:06.129177 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f97rp\" (UniqueName: \"kubernetes.io/projected/577f63dd-8b20-434a-ae9b-3d9589f08ccf-kube-api-access-f97rp\") pod \"redhat-marketplace-vpfn4\" (UID: \"577f63dd-8b20-434a-ae9b-3d9589f08ccf\") " pod="openshift-marketplace/redhat-marketplace-vpfn4" Feb 27 00:48:06 crc kubenswrapper[4781]: I0227 00:48:06.129304 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/577f63dd-8b20-434a-ae9b-3d9589f08ccf-catalog-content\") pod \"redhat-marketplace-vpfn4\" (UID: \"577f63dd-8b20-434a-ae9b-3d9589f08ccf\") " pod="openshift-marketplace/redhat-marketplace-vpfn4" Feb 27 00:48:06 crc kubenswrapper[4781]: I0227 00:48:06.129689 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/577f63dd-8b20-434a-ae9b-3d9589f08ccf-utilities\") pod \"redhat-marketplace-vpfn4\" (UID: \"577f63dd-8b20-434a-ae9b-3d9589f08ccf\") " pod="openshift-marketplace/redhat-marketplace-vpfn4" Feb 27 00:48:06 crc kubenswrapper[4781]: I0227 00:48:06.129926 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/577f63dd-8b20-434a-ae9b-3d9589f08ccf-catalog-content\") pod \"redhat-marketplace-vpfn4\" (UID: \"577f63dd-8b20-434a-ae9b-3d9589f08ccf\") " pod="openshift-marketplace/redhat-marketplace-vpfn4" 
Feb 27 00:48:06 crc kubenswrapper[4781]: I0227 00:48:06.149435 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f97rp\" (UniqueName: \"kubernetes.io/projected/577f63dd-8b20-434a-ae9b-3d9589f08ccf-kube-api-access-f97rp\") pod \"redhat-marketplace-vpfn4\" (UID: \"577f63dd-8b20-434a-ae9b-3d9589f08ccf\") " pod="openshift-marketplace/redhat-marketplace-vpfn4" Feb 27 00:48:06 crc kubenswrapper[4781]: I0227 00:48:06.211338 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vpfn4" Feb 27 00:48:06 crc kubenswrapper[4781]: W0227 00:48:06.756147 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod577f63dd_8b20_434a_ae9b_3d9589f08ccf.slice/crio-b9571a6077784c10dc10f222b8867d8ccc892e934abbc12de5aadb40b3104057 WatchSource:0}: Error finding container b9571a6077784c10dc10f222b8867d8ccc892e934abbc12de5aadb40b3104057: Status 404 returned error can't find the container with id b9571a6077784c10dc10f222b8867d8ccc892e934abbc12de5aadb40b3104057 Feb 27 00:48:06 crc kubenswrapper[4781]: I0227 00:48:06.758773 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vpfn4"] Feb 27 00:48:07 crc kubenswrapper[4781]: I0227 00:48:07.687459 4781 generic.go:334] "Generic (PLEG): container finished" podID="577f63dd-8b20-434a-ae9b-3d9589f08ccf" containerID="bf69956ba766952d1ef815cca972db518efdc39024e3860a55ea5727a029f260" exitCode=0 Feb 27 00:48:07 crc kubenswrapper[4781]: I0227 00:48:07.687853 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vpfn4" event={"ID":"577f63dd-8b20-434a-ae9b-3d9589f08ccf","Type":"ContainerDied","Data":"bf69956ba766952d1ef815cca972db518efdc39024e3860a55ea5727a029f260"} Feb 27 00:48:07 crc kubenswrapper[4781]: I0227 00:48:07.687885 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-vpfn4" event={"ID":"577f63dd-8b20-434a-ae9b-3d9589f08ccf","Type":"ContainerStarted","Data":"b9571a6077784c10dc10f222b8867d8ccc892e934abbc12de5aadb40b3104057"} Feb 27 00:48:09 crc kubenswrapper[4781]: I0227 00:48:09.715504 4781 generic.go:334] "Generic (PLEG): container finished" podID="577f63dd-8b20-434a-ae9b-3d9589f08ccf" containerID="66c72fab5346523d3d958a5d39abc7e4f83ebc4364388cdb9b8413208fb7222c" exitCode=0 Feb 27 00:48:09 crc kubenswrapper[4781]: I0227 00:48:09.715598 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vpfn4" event={"ID":"577f63dd-8b20-434a-ae9b-3d9589f08ccf","Type":"ContainerDied","Data":"66c72fab5346523d3d958a5d39abc7e4f83ebc4364388cdb9b8413208fb7222c"} Feb 27 00:48:10 crc kubenswrapper[4781]: I0227 00:48:10.726490 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vpfn4" event={"ID":"577f63dd-8b20-434a-ae9b-3d9589f08ccf","Type":"ContainerStarted","Data":"7ff44f6bb108c843764b83e6f50f916331b2bd5212f51f1b327cfbfa363af4ac"} Feb 27 00:48:10 crc kubenswrapper[4781]: I0227 00:48:10.750248 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vpfn4" podStartSLOduration=3.333062849 podStartE2EDuration="5.750224573s" podCreationTimestamp="2026-02-27 00:48:05 +0000 UTC" firstStartedPulling="2026-02-27 00:48:07.694047167 +0000 UTC m=+2556.951589661" lastFinishedPulling="2026-02-27 00:48:10.111211831 +0000 UTC m=+2559.368751385" observedRunningTime="2026-02-27 00:48:10.745484058 +0000 UTC m=+2560.003023622" watchObservedRunningTime="2026-02-27 00:48:10.750224573 +0000 UTC m=+2560.007764127" Feb 27 00:48:14 crc kubenswrapper[4781]: I0227 00:48:14.309575 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" Feb 27 00:48:14 crc kubenswrapper[4781]: E0227 00:48:14.310536 4781 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:48:16 crc kubenswrapper[4781]: I0227 00:48:16.212131 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vpfn4" Feb 27 00:48:16 crc kubenswrapper[4781]: I0227 00:48:16.212492 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vpfn4" Feb 27 00:48:16 crc kubenswrapper[4781]: I0227 00:48:16.264747 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vpfn4" Feb 27 00:48:16 crc kubenswrapper[4781]: I0227 00:48:16.825651 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vpfn4" Feb 27 00:48:16 crc kubenswrapper[4781]: I0227 00:48:16.873114 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vpfn4"] Feb 27 00:48:18 crc kubenswrapper[4781]: I0227 00:48:18.798579 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vpfn4" podUID="577f63dd-8b20-434a-ae9b-3d9589f08ccf" containerName="registry-server" containerID="cri-o://7ff44f6bb108c843764b83e6f50f916331b2bd5212f51f1b327cfbfa363af4ac" gracePeriod=2 Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.562993 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vpfn4" Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.640411 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f97rp\" (UniqueName: \"kubernetes.io/projected/577f63dd-8b20-434a-ae9b-3d9589f08ccf-kube-api-access-f97rp\") pod \"577f63dd-8b20-434a-ae9b-3d9589f08ccf\" (UID: \"577f63dd-8b20-434a-ae9b-3d9589f08ccf\") " Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.640548 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/577f63dd-8b20-434a-ae9b-3d9589f08ccf-catalog-content\") pod \"577f63dd-8b20-434a-ae9b-3d9589f08ccf\" (UID: \"577f63dd-8b20-434a-ae9b-3d9589f08ccf\") " Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.640709 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/577f63dd-8b20-434a-ae9b-3d9589f08ccf-utilities\") pod \"577f63dd-8b20-434a-ae9b-3d9589f08ccf\" (UID: \"577f63dd-8b20-434a-ae9b-3d9589f08ccf\") " Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.642401 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/577f63dd-8b20-434a-ae9b-3d9589f08ccf-utilities" (OuterVolumeSpecName: "utilities") pod "577f63dd-8b20-434a-ae9b-3d9589f08ccf" (UID: "577f63dd-8b20-434a-ae9b-3d9589f08ccf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.651564 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/577f63dd-8b20-434a-ae9b-3d9589f08ccf-kube-api-access-f97rp" (OuterVolumeSpecName: "kube-api-access-f97rp") pod "577f63dd-8b20-434a-ae9b-3d9589f08ccf" (UID: "577f63dd-8b20-434a-ae9b-3d9589f08ccf"). InnerVolumeSpecName "kube-api-access-f97rp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.668274 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/577f63dd-8b20-434a-ae9b-3d9589f08ccf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "577f63dd-8b20-434a-ae9b-3d9589f08ccf" (UID: "577f63dd-8b20-434a-ae9b-3d9589f08ccf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.742924 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/577f63dd-8b20-434a-ae9b-3d9589f08ccf-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.742963 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f97rp\" (UniqueName: \"kubernetes.io/projected/577f63dd-8b20-434a-ae9b-3d9589f08ccf-kube-api-access-f97rp\") on node \"crc\" DevicePath \"\"" Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.742978 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/577f63dd-8b20-434a-ae9b-3d9589f08ccf-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.811505 4781 generic.go:334] "Generic (PLEG): container finished" podID="577f63dd-8b20-434a-ae9b-3d9589f08ccf" containerID="7ff44f6bb108c843764b83e6f50f916331b2bd5212f51f1b327cfbfa363af4ac" exitCode=0 Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.811559 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vpfn4" event={"ID":"577f63dd-8b20-434a-ae9b-3d9589f08ccf","Type":"ContainerDied","Data":"7ff44f6bb108c843764b83e6f50f916331b2bd5212f51f1b327cfbfa363af4ac"} Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.811595 4781 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vpfn4" Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.811618 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vpfn4" event={"ID":"577f63dd-8b20-434a-ae9b-3d9589f08ccf","Type":"ContainerDied","Data":"b9571a6077784c10dc10f222b8867d8ccc892e934abbc12de5aadb40b3104057"} Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.811653 4781 scope.go:117] "RemoveContainer" containerID="7ff44f6bb108c843764b83e6f50f916331b2bd5212f51f1b327cfbfa363af4ac" Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.834002 4781 scope.go:117] "RemoveContainer" containerID="66c72fab5346523d3d958a5d39abc7e4f83ebc4364388cdb9b8413208fb7222c" Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.864330 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vpfn4"] Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.881288 4781 scope.go:117] "RemoveContainer" containerID="bf69956ba766952d1ef815cca972db518efdc39024e3860a55ea5727a029f260" Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.883080 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vpfn4"] Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.925960 4781 scope.go:117] "RemoveContainer" containerID="7ff44f6bb108c843764b83e6f50f916331b2bd5212f51f1b327cfbfa363af4ac" Feb 27 00:48:19 crc kubenswrapper[4781]: E0227 00:48:19.926477 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ff44f6bb108c843764b83e6f50f916331b2bd5212f51f1b327cfbfa363af4ac\": container with ID starting with 7ff44f6bb108c843764b83e6f50f916331b2bd5212f51f1b327cfbfa363af4ac not found: ID does not exist" containerID="7ff44f6bb108c843764b83e6f50f916331b2bd5212f51f1b327cfbfa363af4ac" Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.926507 4781 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ff44f6bb108c843764b83e6f50f916331b2bd5212f51f1b327cfbfa363af4ac"} err="failed to get container status \"7ff44f6bb108c843764b83e6f50f916331b2bd5212f51f1b327cfbfa363af4ac\": rpc error: code = NotFound desc = could not find container \"7ff44f6bb108c843764b83e6f50f916331b2bd5212f51f1b327cfbfa363af4ac\": container with ID starting with 7ff44f6bb108c843764b83e6f50f916331b2bd5212f51f1b327cfbfa363af4ac not found: ID does not exist" Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.926533 4781 scope.go:117] "RemoveContainer" containerID="66c72fab5346523d3d958a5d39abc7e4f83ebc4364388cdb9b8413208fb7222c" Feb 27 00:48:19 crc kubenswrapper[4781]: E0227 00:48:19.926870 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66c72fab5346523d3d958a5d39abc7e4f83ebc4364388cdb9b8413208fb7222c\": container with ID starting with 66c72fab5346523d3d958a5d39abc7e4f83ebc4364388cdb9b8413208fb7222c not found: ID does not exist" containerID="66c72fab5346523d3d958a5d39abc7e4f83ebc4364388cdb9b8413208fb7222c" Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.926920 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66c72fab5346523d3d958a5d39abc7e4f83ebc4364388cdb9b8413208fb7222c"} err="failed to get container status \"66c72fab5346523d3d958a5d39abc7e4f83ebc4364388cdb9b8413208fb7222c\": rpc error: code = NotFound desc = could not find container \"66c72fab5346523d3d958a5d39abc7e4f83ebc4364388cdb9b8413208fb7222c\": container with ID starting with 66c72fab5346523d3d958a5d39abc7e4f83ebc4364388cdb9b8413208fb7222c not found: ID does not exist" Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.926951 4781 scope.go:117] "RemoveContainer" containerID="bf69956ba766952d1ef815cca972db518efdc39024e3860a55ea5727a029f260" Feb 27 00:48:19 crc kubenswrapper[4781]: E0227 
00:48:19.927302 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf69956ba766952d1ef815cca972db518efdc39024e3860a55ea5727a029f260\": container with ID starting with bf69956ba766952d1ef815cca972db518efdc39024e3860a55ea5727a029f260 not found: ID does not exist" containerID="bf69956ba766952d1ef815cca972db518efdc39024e3860a55ea5727a029f260" Feb 27 00:48:19 crc kubenswrapper[4781]: I0227 00:48:19.927368 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf69956ba766952d1ef815cca972db518efdc39024e3860a55ea5727a029f260"} err="failed to get container status \"bf69956ba766952d1ef815cca972db518efdc39024e3860a55ea5727a029f260\": rpc error: code = NotFound desc = could not find container \"bf69956ba766952d1ef815cca972db518efdc39024e3860a55ea5727a029f260\": container with ID starting with bf69956ba766952d1ef815cca972db518efdc39024e3860a55ea5727a029f260 not found: ID does not exist" Feb 27 00:48:21 crc kubenswrapper[4781]: I0227 00:48:21.320514 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="577f63dd-8b20-434a-ae9b-3d9589f08ccf" path="/var/lib/kubelet/pods/577f63dd-8b20-434a-ae9b-3d9589f08ccf/volumes" Feb 27 00:48:28 crc kubenswrapper[4781]: I0227 00:48:28.311280 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" Feb 27 00:48:28 crc kubenswrapper[4781]: E0227 00:48:28.312919 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:48:31 crc kubenswrapper[4781]: I0227 00:48:31.482681 
4781 scope.go:117] "RemoveContainer" containerID="bcc82c4ff93196fe9d1d81964a39e384053e68533a13a500ed58309dd14ee8eb" Feb 27 00:48:39 crc kubenswrapper[4781]: I0227 00:48:39.309651 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" Feb 27 00:48:39 crc kubenswrapper[4781]: E0227 00:48:39.310433 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:48:42 crc kubenswrapper[4781]: I0227 00:48:42.224266 4781 generic.go:334] "Generic (PLEG): container finished" podID="d3f8abc3-17b4-4d88-890e-85304a100a97" containerID="d3e6c31e59c8273a4822b6ba92413f35b26f4d3e1b11014494798bec77bd763c" exitCode=0 Feb 27 00:48:42 crc kubenswrapper[4781]: I0227 00:48:42.224327 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" event={"ID":"d3f8abc3-17b4-4d88-890e-85304a100a97","Type":"ContainerDied","Data":"d3e6c31e59c8273a4822b6ba92413f35b26f4d3e1b11014494798bec77bd763c"} Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.803310 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.899950 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-1\") pod \"d3f8abc3-17b4-4d88-890e-85304a100a97\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.900024 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-extra-config-0\") pod \"d3f8abc3-17b4-4d88-890e-85304a100a97\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.900067 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-migration-ssh-key-1\") pod \"d3f8abc3-17b4-4d88-890e-85304a100a97\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.900110 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-0\") pod \"d3f8abc3-17b4-4d88-890e-85304a100a97\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.900214 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-ssh-key-openstack-edpm-ipam\") pod \"d3f8abc3-17b4-4d88-890e-85304a100a97\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " Feb 27 00:48:43 crc kubenswrapper[4781]: 
I0227 00:48:43.900335 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-migration-ssh-key-0\") pod \"d3f8abc3-17b4-4d88-890e-85304a100a97\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.900374 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-3\") pod \"d3f8abc3-17b4-4d88-890e-85304a100a97\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.900404 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-2\") pod \"d3f8abc3-17b4-4d88-890e-85304a100a97\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.900499 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7qk5\" (UniqueName: \"kubernetes.io/projected/d3f8abc3-17b4-4d88-890e-85304a100a97-kube-api-access-n7qk5\") pod \"d3f8abc3-17b4-4d88-890e-85304a100a97\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.900556 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-combined-ca-bundle\") pod \"d3f8abc3-17b4-4d88-890e-85304a100a97\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.900608 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-inventory\") pod \"d3f8abc3-17b4-4d88-890e-85304a100a97\" (UID: \"d3f8abc3-17b4-4d88-890e-85304a100a97\") " Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.912097 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3f8abc3-17b4-4d88-890e-85304a100a97-kube-api-access-n7qk5" (OuterVolumeSpecName: "kube-api-access-n7qk5") pod "d3f8abc3-17b4-4d88-890e-85304a100a97" (UID: "d3f8abc3-17b4-4d88-890e-85304a100a97"). InnerVolumeSpecName "kube-api-access-n7qk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.933982 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "d3f8abc3-17b4-4d88-890e-85304a100a97" (UID: "d3f8abc3-17b4-4d88-890e-85304a100a97"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.944197 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "d3f8abc3-17b4-4d88-890e-85304a100a97" (UID: "d3f8abc3-17b4-4d88-890e-85304a100a97"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.948183 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "d3f8abc3-17b4-4d88-890e-85304a100a97" (UID: "d3f8abc3-17b4-4d88-890e-85304a100a97"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.955846 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "d3f8abc3-17b4-4d88-890e-85304a100a97" (UID: "d3f8abc3-17b4-4d88-890e-85304a100a97"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.957916 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "d3f8abc3-17b4-4d88-890e-85304a100a97" (UID: "d3f8abc3-17b4-4d88-890e-85304a100a97"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.958403 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "d3f8abc3-17b4-4d88-890e-85304a100a97" (UID: "d3f8abc3-17b4-4d88-890e-85304a100a97"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.971544 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-inventory" (OuterVolumeSpecName: "inventory") pod "d3f8abc3-17b4-4d88-890e-85304a100a97" (UID: "d3f8abc3-17b4-4d88-890e-85304a100a97"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.981454 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d3f8abc3-17b4-4d88-890e-85304a100a97" (UID: "d3f8abc3-17b4-4d88-890e-85304a100a97"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.981506 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "d3f8abc3-17b4-4d88-890e-85304a100a97" (UID: "d3f8abc3-17b4-4d88-890e-85304a100a97"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 00:48:43 crc kubenswrapper[4781]: I0227 00:48:43.982846 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "d3f8abc3-17b4-4d88-890e-85304a100a97" (UID: "d3f8abc3-17b4-4d88-890e-85304a100a97"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.002923 4781 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.002973 4781 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.002987 4781 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.002999 4781 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.003012 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.003024 4781 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.003036 4781 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: 
\"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.003049 4781 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.003063 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7qk5\" (UniqueName: \"kubernetes.io/projected/d3f8abc3-17b4-4d88-890e-85304a100a97-kube-api-access-n7qk5\") on node \"crc\" DevicePath \"\"" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.003075 4781 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.003087 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3f8abc3-17b4-4d88-890e-85304a100a97-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.246337 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" event={"ID":"d3f8abc3-17b4-4d88-890e-85304a100a97","Type":"ContainerDied","Data":"0f40d07d67261eda4bba8df0dd754f507383635699da4a2039a40542ef874ffe"} Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.246704 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f40d07d67261eda4bba8df0dd754f507383635699da4a2039a40542ef874ffe" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.246424 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ntt4h" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.357847 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs"] Feb 27 00:48:44 crc kubenswrapper[4781]: E0227 00:48:44.358295 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="577f63dd-8b20-434a-ae9b-3d9589f08ccf" containerName="extract-content" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.358319 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="577f63dd-8b20-434a-ae9b-3d9589f08ccf" containerName="extract-content" Feb 27 00:48:44 crc kubenswrapper[4781]: E0227 00:48:44.358341 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="577f63dd-8b20-434a-ae9b-3d9589f08ccf" containerName="extract-utilities" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.358347 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="577f63dd-8b20-434a-ae9b-3d9589f08ccf" containerName="extract-utilities" Feb 27 00:48:44 crc kubenswrapper[4781]: E0227 00:48:44.358375 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="577f63dd-8b20-434a-ae9b-3d9589f08ccf" containerName="registry-server" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.358383 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="577f63dd-8b20-434a-ae9b-3d9589f08ccf" containerName="registry-server" Feb 27 00:48:44 crc kubenswrapper[4781]: E0227 00:48:44.358395 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3f8abc3-17b4-4d88-890e-85304a100a97" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.358401 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3f8abc3-17b4-4d88-890e-85304a100a97" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.358578 4781 
memory_manager.go:354] "RemoveStaleState removing state" podUID="577f63dd-8b20-434a-ae9b-3d9589f08ccf" containerName="registry-server" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.358594 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3f8abc3-17b4-4d88-890e-85304a100a97" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.359959 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.361856 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.362027 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mvxs7" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.365370 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.365898 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.366127 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.378477 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs"] Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.410787 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-telemetry-combined-ca-bundle\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.410853 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.410891 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.411064 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78gjh\" (UniqueName: \"kubernetes.io/projected/7a6c3903-7dfd-49cd-a92f-d138e10db404-kube-api-access-78gjh\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.411241 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.411455 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.411598 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.515095 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.515962 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.516036 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.516115 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.516151 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.516194 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.516220 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78gjh\" (UniqueName: 
\"kubernetes.io/projected/7a6c3903-7dfd-49cd-a92f-d138e10db404-kube-api-access-78gjh\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.521211 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.521225 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.521284 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.521604 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.521785 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.532074 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.537262 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78gjh\" (UniqueName: \"kubernetes.io/projected/7a6c3903-7dfd-49cd-a92f-d138e10db404-kube-api-access-78gjh\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs" Feb 27 00:48:44 crc kubenswrapper[4781]: I0227 00:48:44.679868 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs" Feb 27 00:48:45 crc kubenswrapper[4781]: I0227 00:48:45.227927 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs"] Feb 27 00:48:45 crc kubenswrapper[4781]: I0227 00:48:45.257510 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs" event={"ID":"7a6c3903-7dfd-49cd-a92f-d138e10db404","Type":"ContainerStarted","Data":"c98c73db13c2e6f240e777de96c450d5ef1c4ef457d610c978e3c63e24c6b834"} Feb 27 00:48:46 crc kubenswrapper[4781]: I0227 00:48:46.267700 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs" event={"ID":"7a6c3903-7dfd-49cd-a92f-d138e10db404","Type":"ContainerStarted","Data":"892f0bf5e76001c655bd1216bacfc80b10ec06394101b3e897d30710d368bcae"} Feb 27 00:48:46 crc kubenswrapper[4781]: I0227 00:48:46.286466 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs" podStartSLOduration=1.7288784750000001 podStartE2EDuration="2.286450882s" podCreationTimestamp="2026-02-27 00:48:44 +0000 UTC" firstStartedPulling="2026-02-27 00:48:45.230210819 +0000 UTC m=+2594.487750373" lastFinishedPulling="2026-02-27 00:48:45.787783236 +0000 UTC m=+2595.045322780" observedRunningTime="2026-02-27 00:48:46.283196867 +0000 UTC m=+2595.540736431" watchObservedRunningTime="2026-02-27 00:48:46.286450882 +0000 UTC m=+2595.543990436" Feb 27 00:48:51 crc kubenswrapper[4781]: I0227 00:48:51.316446 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" Feb 27 00:48:51 crc kubenswrapper[4781]: E0227 00:48:51.317273 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:49:06 crc kubenswrapper[4781]: I0227 00:49:06.309999 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" Feb 27 00:49:06 crc kubenswrapper[4781]: E0227 00:49:06.311017 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:49:21 crc kubenswrapper[4781]: I0227 00:49:21.316763 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" Feb 27 00:49:21 crc kubenswrapper[4781]: E0227 00:49:21.317660 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:49:35 crc kubenswrapper[4781]: I0227 00:49:35.309476 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" Feb 27 00:49:35 crc kubenswrapper[4781]: E0227 00:49:35.310293 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:49:47 crc kubenswrapper[4781]: I0227 00:49:47.309366 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" Feb 27 00:49:47 crc kubenswrapper[4781]: E0227 00:49:47.310162 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:50:00 crc kubenswrapper[4781]: I0227 00:50:00.145257 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535890-nv49g"] Feb 27 00:50:00 crc kubenswrapper[4781]: I0227 00:50:00.147271 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535890-nv49g" Feb 27 00:50:00 crc kubenswrapper[4781]: I0227 00:50:00.150364 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:50:00 crc kubenswrapper[4781]: I0227 00:50:00.150615 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:50:00 crc kubenswrapper[4781]: I0227 00:50:00.150765 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:50:00 crc kubenswrapper[4781]: I0227 00:50:00.155959 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535890-nv49g"] Feb 27 00:50:00 crc kubenswrapper[4781]: I0227 00:50:00.267445 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t45t\" (UniqueName: \"kubernetes.io/projected/88fd3abb-2996-49d0-851b-41e0040438fa-kube-api-access-4t45t\") pod \"auto-csr-approver-29535890-nv49g\" (UID: \"88fd3abb-2996-49d0-851b-41e0040438fa\") " pod="openshift-infra/auto-csr-approver-29535890-nv49g" Feb 27 00:50:00 crc kubenswrapper[4781]: I0227 00:50:00.309539 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" Feb 27 00:50:00 crc kubenswrapper[4781]: E0227 00:50:00.309930 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:50:00 crc kubenswrapper[4781]: I0227 00:50:00.370216 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4t45t\" (UniqueName: \"kubernetes.io/projected/88fd3abb-2996-49d0-851b-41e0040438fa-kube-api-access-4t45t\") pod \"auto-csr-approver-29535890-nv49g\" (UID: \"88fd3abb-2996-49d0-851b-41e0040438fa\") " pod="openshift-infra/auto-csr-approver-29535890-nv49g" Feb 27 00:50:00 crc kubenswrapper[4781]: I0227 00:50:00.391453 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t45t\" (UniqueName: \"kubernetes.io/projected/88fd3abb-2996-49d0-851b-41e0040438fa-kube-api-access-4t45t\") pod \"auto-csr-approver-29535890-nv49g\" (UID: \"88fd3abb-2996-49d0-851b-41e0040438fa\") " pod="openshift-infra/auto-csr-approver-29535890-nv49g" Feb 27 00:50:00 crc kubenswrapper[4781]: I0227 00:50:00.477419 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535890-nv49g" Feb 27 00:50:00 crc kubenswrapper[4781]: I0227 00:50:00.985843 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535890-nv49g"] Feb 27 00:50:00 crc kubenswrapper[4781]: I0227 00:50:00.997468 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535890-nv49g" event={"ID":"88fd3abb-2996-49d0-851b-41e0040438fa","Type":"ContainerStarted","Data":"c7b350d60c82e6d7b71728768683583c947111cd6ca7fe9cb3ff82449ef63dbf"} Feb 27 00:50:03 crc kubenswrapper[4781]: I0227 00:50:03.018232 4781 generic.go:334] "Generic (PLEG): container finished" podID="88fd3abb-2996-49d0-851b-41e0040438fa" containerID="f2bb4ab5a55c1440d2f1c4f2cba63824f23a7c027f89afef51d965a575920c2b" exitCode=0 Feb 27 00:50:03 crc kubenswrapper[4781]: I0227 00:50:03.018309 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535890-nv49g" 
event={"ID":"88fd3abb-2996-49d0-851b-41e0040438fa","Type":"ContainerDied","Data":"f2bb4ab5a55c1440d2f1c4f2cba63824f23a7c027f89afef51d965a575920c2b"} Feb 27 00:50:04 crc kubenswrapper[4781]: I0227 00:50:04.430616 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535890-nv49g" Feb 27 00:50:04 crc kubenswrapper[4781]: I0227 00:50:04.561548 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t45t\" (UniqueName: \"kubernetes.io/projected/88fd3abb-2996-49d0-851b-41e0040438fa-kube-api-access-4t45t\") pod \"88fd3abb-2996-49d0-851b-41e0040438fa\" (UID: \"88fd3abb-2996-49d0-851b-41e0040438fa\") " Feb 27 00:50:04 crc kubenswrapper[4781]: I0227 00:50:04.567374 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88fd3abb-2996-49d0-851b-41e0040438fa-kube-api-access-4t45t" (OuterVolumeSpecName: "kube-api-access-4t45t") pod "88fd3abb-2996-49d0-851b-41e0040438fa" (UID: "88fd3abb-2996-49d0-851b-41e0040438fa"). InnerVolumeSpecName "kube-api-access-4t45t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:50:04 crc kubenswrapper[4781]: I0227 00:50:04.664224 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t45t\" (UniqueName: \"kubernetes.io/projected/88fd3abb-2996-49d0-851b-41e0040438fa-kube-api-access-4t45t\") on node \"crc\" DevicePath \"\"" Feb 27 00:50:05 crc kubenswrapper[4781]: I0227 00:50:05.041507 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535890-nv49g" event={"ID":"88fd3abb-2996-49d0-851b-41e0040438fa","Type":"ContainerDied","Data":"c7b350d60c82e6d7b71728768683583c947111cd6ca7fe9cb3ff82449ef63dbf"} Feb 27 00:50:05 crc kubenswrapper[4781]: I0227 00:50:05.041553 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7b350d60c82e6d7b71728768683583c947111cd6ca7fe9cb3ff82449ef63dbf" Feb 27 00:50:05 crc kubenswrapper[4781]: I0227 00:50:05.041611 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535890-nv49g" Feb 27 00:50:05 crc kubenswrapper[4781]: I0227 00:50:05.505284 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535884-8t2lb"] Feb 27 00:50:05 crc kubenswrapper[4781]: I0227 00:50:05.515428 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535884-8t2lb"] Feb 27 00:50:07 crc kubenswrapper[4781]: I0227 00:50:07.319949 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="018f4ff5-f081-4257-8189-3eb14ea035f3" path="/var/lib/kubelet/pods/018f4ff5-f081-4257-8189-3eb14ea035f3/volumes" Feb 27 00:50:13 crc kubenswrapper[4781]: I0227 00:50:13.311882 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" Feb 27 00:50:13 crc kubenswrapper[4781]: E0227 00:50:13.329365 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:50:24 crc kubenswrapper[4781]: I0227 00:50:24.507088 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qswm5"] Feb 27 00:50:24 crc kubenswrapper[4781]: E0227 00:50:24.509315 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88fd3abb-2996-49d0-851b-41e0040438fa" containerName="oc" Feb 27 00:50:24 crc kubenswrapper[4781]: I0227 00:50:24.509413 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="88fd3abb-2996-49d0-851b-41e0040438fa" containerName="oc" Feb 27 00:50:24 crc kubenswrapper[4781]: I0227 00:50:24.509795 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="88fd3abb-2996-49d0-851b-41e0040438fa" containerName="oc" Feb 27 00:50:24 crc kubenswrapper[4781]: I0227 00:50:24.511884 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qswm5" Feb 27 00:50:24 crc kubenswrapper[4781]: I0227 00:50:24.522431 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qswm5"] Feb 27 00:50:24 crc kubenswrapper[4781]: I0227 00:50:24.585615 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtf8b\" (UniqueName: \"kubernetes.io/projected/e55472e5-5c75-4b68-9c22-dcf37baffe6a-kube-api-access-mtf8b\") pod \"certified-operators-qswm5\" (UID: \"e55472e5-5c75-4b68-9c22-dcf37baffe6a\") " pod="openshift-marketplace/certified-operators-qswm5" Feb 27 00:50:24 crc kubenswrapper[4781]: I0227 00:50:24.585673 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e55472e5-5c75-4b68-9c22-dcf37baffe6a-utilities\") pod \"certified-operators-qswm5\" (UID: \"e55472e5-5c75-4b68-9c22-dcf37baffe6a\") " pod="openshift-marketplace/certified-operators-qswm5" Feb 27 00:50:24 crc kubenswrapper[4781]: I0227 00:50:24.585784 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e55472e5-5c75-4b68-9c22-dcf37baffe6a-catalog-content\") pod \"certified-operators-qswm5\" (UID: \"e55472e5-5c75-4b68-9c22-dcf37baffe6a\") " pod="openshift-marketplace/certified-operators-qswm5" Feb 27 00:50:24 crc kubenswrapper[4781]: I0227 00:50:24.688257 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtf8b\" (UniqueName: \"kubernetes.io/projected/e55472e5-5c75-4b68-9c22-dcf37baffe6a-kube-api-access-mtf8b\") pod \"certified-operators-qswm5\" (UID: \"e55472e5-5c75-4b68-9c22-dcf37baffe6a\") " pod="openshift-marketplace/certified-operators-qswm5" Feb 27 00:50:24 crc kubenswrapper[4781]: I0227 00:50:24.688320 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e55472e5-5c75-4b68-9c22-dcf37baffe6a-utilities\") pod \"certified-operators-qswm5\" (UID: \"e55472e5-5c75-4b68-9c22-dcf37baffe6a\") " pod="openshift-marketplace/certified-operators-qswm5" Feb 27 00:50:24 crc kubenswrapper[4781]: I0227 00:50:24.688441 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e55472e5-5c75-4b68-9c22-dcf37baffe6a-catalog-content\") pod \"certified-operators-qswm5\" (UID: \"e55472e5-5c75-4b68-9c22-dcf37baffe6a\") " pod="openshift-marketplace/certified-operators-qswm5" Feb 27 00:50:24 crc kubenswrapper[4781]: I0227 00:50:24.689112 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e55472e5-5c75-4b68-9c22-dcf37baffe6a-utilities\") pod \"certified-operators-qswm5\" (UID: \"e55472e5-5c75-4b68-9c22-dcf37baffe6a\") " pod="openshift-marketplace/certified-operators-qswm5" Feb 27 00:50:24 crc kubenswrapper[4781]: I0227 00:50:24.689147 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e55472e5-5c75-4b68-9c22-dcf37baffe6a-catalog-content\") pod \"certified-operators-qswm5\" (UID: \"e55472e5-5c75-4b68-9c22-dcf37baffe6a\") " pod="openshift-marketplace/certified-operators-qswm5" Feb 27 00:50:24 crc kubenswrapper[4781]: I0227 00:50:24.711096 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtf8b\" (UniqueName: \"kubernetes.io/projected/e55472e5-5c75-4b68-9c22-dcf37baffe6a-kube-api-access-mtf8b\") pod \"certified-operators-qswm5\" (UID: \"e55472e5-5c75-4b68-9c22-dcf37baffe6a\") " pod="openshift-marketplace/certified-operators-qswm5" Feb 27 00:50:24 crc kubenswrapper[4781]: I0227 00:50:24.839197 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qswm5" Feb 27 00:50:25 crc kubenswrapper[4781]: I0227 00:50:25.313201 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" Feb 27 00:50:25 crc kubenswrapper[4781]: E0227 00:50:25.314117 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:50:25 crc kubenswrapper[4781]: I0227 00:50:25.380982 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qswm5"] Feb 27 00:50:26 crc kubenswrapper[4781]: I0227 00:50:26.251805 4781 generic.go:334] "Generic (PLEG): container finished" podID="e55472e5-5c75-4b68-9c22-dcf37baffe6a" containerID="ca0d4a966b85127f600d15aa94b7dd95b92cd6c15a6584862d4a007c754f3f74" exitCode=0 Feb 27 00:50:26 crc kubenswrapper[4781]: I0227 00:50:26.251946 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qswm5" event={"ID":"e55472e5-5c75-4b68-9c22-dcf37baffe6a","Type":"ContainerDied","Data":"ca0d4a966b85127f600d15aa94b7dd95b92cd6c15a6584862d4a007c754f3f74"} Feb 27 00:50:26 crc kubenswrapper[4781]: I0227 00:50:26.252228 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qswm5" event={"ID":"e55472e5-5c75-4b68-9c22-dcf37baffe6a","Type":"ContainerStarted","Data":"ec42b700956099171afc0f35a3868f09a1f40ac9aa51906b323a60c042870bbe"} Feb 27 00:50:27 crc kubenswrapper[4781]: I0227 00:50:27.697340 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xlssc"] Feb 27 
00:50:27 crc kubenswrapper[4781]: I0227 00:50:27.706425 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xlssc" Feb 27 00:50:27 crc kubenswrapper[4781]: I0227 00:50:27.712229 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xlssc"] Feb 27 00:50:27 crc kubenswrapper[4781]: I0227 00:50:27.856809 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c-catalog-content\") pod \"redhat-operators-xlssc\" (UID: \"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c\") " pod="openshift-marketplace/redhat-operators-xlssc" Feb 27 00:50:27 crc kubenswrapper[4781]: I0227 00:50:27.856909 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcb47\" (UniqueName: \"kubernetes.io/projected/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c-kube-api-access-pcb47\") pod \"redhat-operators-xlssc\" (UID: \"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c\") " pod="openshift-marketplace/redhat-operators-xlssc" Feb 27 00:50:27 crc kubenswrapper[4781]: I0227 00:50:27.856953 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c-utilities\") pod \"redhat-operators-xlssc\" (UID: \"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c\") " pod="openshift-marketplace/redhat-operators-xlssc" Feb 27 00:50:27 crc kubenswrapper[4781]: I0227 00:50:27.959273 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcb47\" (UniqueName: \"kubernetes.io/projected/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c-kube-api-access-pcb47\") pod \"redhat-operators-xlssc\" (UID: \"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c\") " pod="openshift-marketplace/redhat-operators-xlssc" Feb 27 
00:50:27 crc kubenswrapper[4781]: I0227 00:50:27.960067 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c-utilities\") pod \"redhat-operators-xlssc\" (UID: \"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c\") " pod="openshift-marketplace/redhat-operators-xlssc" Feb 27 00:50:27 crc kubenswrapper[4781]: I0227 00:50:27.960484 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c-utilities\") pod \"redhat-operators-xlssc\" (UID: \"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c\") " pod="openshift-marketplace/redhat-operators-xlssc" Feb 27 00:50:27 crc kubenswrapper[4781]: I0227 00:50:27.960675 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c-catalog-content\") pod \"redhat-operators-xlssc\" (UID: \"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c\") " pod="openshift-marketplace/redhat-operators-xlssc" Feb 27 00:50:27 crc kubenswrapper[4781]: I0227 00:50:27.960926 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c-catalog-content\") pod \"redhat-operators-xlssc\" (UID: \"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c\") " pod="openshift-marketplace/redhat-operators-xlssc" Feb 27 00:50:27 crc kubenswrapper[4781]: I0227 00:50:27.986341 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcb47\" (UniqueName: \"kubernetes.io/projected/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c-kube-api-access-pcb47\") pod \"redhat-operators-xlssc\" (UID: \"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c\") " pod="openshift-marketplace/redhat-operators-xlssc" Feb 27 00:50:28 crc kubenswrapper[4781]: I0227 00:50:28.028883 4781 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xlssc" Feb 27 00:50:28 crc kubenswrapper[4781]: I0227 00:50:28.274926 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qswm5" event={"ID":"e55472e5-5c75-4b68-9c22-dcf37baffe6a","Type":"ContainerStarted","Data":"4b2f47ab040f83d4d4d60dc545300765af7ebb8d17389a8134432cba5b907269"} Feb 27 00:50:28 crc kubenswrapper[4781]: I0227 00:50:28.564899 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xlssc"] Feb 27 00:50:28 crc kubenswrapper[4781]: W0227 00:50:28.570027 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f4ba98e_f6f5_41cc_8618_22dfb8700b4c.slice/crio-556870de211a67a576b43464c6f861f745bcebf6538ad4c84adeb7c03d8872b7 WatchSource:0}: Error finding container 556870de211a67a576b43464c6f861f745bcebf6538ad4c84adeb7c03d8872b7: Status 404 returned error can't find the container with id 556870de211a67a576b43464c6f861f745bcebf6538ad4c84adeb7c03d8872b7 Feb 27 00:50:29 crc kubenswrapper[4781]: I0227 00:50:29.286560 4781 generic.go:334] "Generic (PLEG): container finished" podID="3f4ba98e-f6f5-41cc-8618-22dfb8700b4c" containerID="e9a2b535c63935aa029b83cd83e6fcc05f884ee982b0306361645077d432a613" exitCode=0 Feb 27 00:50:29 crc kubenswrapper[4781]: I0227 00:50:29.286666 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xlssc" event={"ID":"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c","Type":"ContainerDied","Data":"e9a2b535c63935aa029b83cd83e6fcc05f884ee982b0306361645077d432a613"} Feb 27 00:50:29 crc kubenswrapper[4781]: I0227 00:50:29.286996 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xlssc" 
event={"ID":"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c","Type":"ContainerStarted","Data":"556870de211a67a576b43464c6f861f745bcebf6538ad4c84adeb7c03d8872b7"} Feb 27 00:50:30 crc kubenswrapper[4781]: I0227 00:50:30.299595 4781 generic.go:334] "Generic (PLEG): container finished" podID="e55472e5-5c75-4b68-9c22-dcf37baffe6a" containerID="4b2f47ab040f83d4d4d60dc545300765af7ebb8d17389a8134432cba5b907269" exitCode=0 Feb 27 00:50:30 crc kubenswrapper[4781]: I0227 00:50:30.299674 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qswm5" event={"ID":"e55472e5-5c75-4b68-9c22-dcf37baffe6a","Type":"ContainerDied","Data":"4b2f47ab040f83d4d4d60dc545300765af7ebb8d17389a8134432cba5b907269"} Feb 27 00:50:31 crc kubenswrapper[4781]: I0227 00:50:31.323959 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qswm5" event={"ID":"e55472e5-5c75-4b68-9c22-dcf37baffe6a","Type":"ContainerStarted","Data":"448f33b37ba3a7f2f89fc829e1f27aba076c2ccace43bc5bc17bed3c43c44cd6"} Feb 27 00:50:31 crc kubenswrapper[4781]: I0227 00:50:31.324275 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xlssc" event={"ID":"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c","Type":"ContainerStarted","Data":"f9bd335f29629807774b995c1792f6d10b37004e22fc7bebdd3bf91a9936e8f9"} Feb 27 00:50:31 crc kubenswrapper[4781]: I0227 00:50:31.363346 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qswm5" podStartSLOduration=2.875918501 podStartE2EDuration="7.363327507s" podCreationTimestamp="2026-02-27 00:50:24 +0000 UTC" firstStartedPulling="2026-02-27 00:50:26.25408046 +0000 UTC m=+2695.511620014" lastFinishedPulling="2026-02-27 00:50:30.741489466 +0000 UTC m=+2699.999029020" observedRunningTime="2026-02-27 00:50:31.35700478 +0000 UTC m=+2700.614544334" watchObservedRunningTime="2026-02-27 00:50:31.363327507 +0000 UTC 
m=+2700.620867061" Feb 27 00:50:31 crc kubenswrapper[4781]: I0227 00:50:31.588831 4781 scope.go:117] "RemoveContainer" containerID="26e013582f5ee2e314ebc2f4329b87db88bd3251fee9e3e932b5b02ee387f73b" Feb 27 00:50:34 crc kubenswrapper[4781]: I0227 00:50:34.839649 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qswm5" Feb 27 00:50:34 crc kubenswrapper[4781]: I0227 00:50:34.840398 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qswm5" Feb 27 00:50:35 crc kubenswrapper[4781]: I0227 00:50:35.897936 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-qswm5" podUID="e55472e5-5c75-4b68-9c22-dcf37baffe6a" containerName="registry-server" probeResult="failure" output=< Feb 27 00:50:35 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s Feb 27 00:50:35 crc kubenswrapper[4781]: > Feb 27 00:50:36 crc kubenswrapper[4781]: I0227 00:50:36.371485 4781 generic.go:334] "Generic (PLEG): container finished" podID="3f4ba98e-f6f5-41cc-8618-22dfb8700b4c" containerID="f9bd335f29629807774b995c1792f6d10b37004e22fc7bebdd3bf91a9936e8f9" exitCode=0 Feb 27 00:50:36 crc kubenswrapper[4781]: I0227 00:50:36.371534 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xlssc" event={"ID":"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c","Type":"ContainerDied","Data":"f9bd335f29629807774b995c1792f6d10b37004e22fc7bebdd3bf91a9936e8f9"} Feb 27 00:50:37 crc kubenswrapper[4781]: I0227 00:50:37.407086 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xlssc" event={"ID":"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c","Type":"ContainerStarted","Data":"137a771e08a94de721dc80e4a4977a1120b1608ed3b54589424df944d9bdf9b9"} Feb 27 00:50:37 crc kubenswrapper[4781]: I0227 00:50:37.433451 4781 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xlssc" podStartSLOduration=2.865682895 podStartE2EDuration="10.433430393s" podCreationTimestamp="2026-02-27 00:50:27 +0000 UTC" firstStartedPulling="2026-02-27 00:50:29.288601725 +0000 UTC m=+2698.546141279" lastFinishedPulling="2026-02-27 00:50:36.856349233 +0000 UTC m=+2706.113888777" observedRunningTime="2026-02-27 00:50:37.427495607 +0000 UTC m=+2706.685035181" watchObservedRunningTime="2026-02-27 00:50:37.433430393 +0000 UTC m=+2706.690969947" Feb 27 00:50:38 crc kubenswrapper[4781]: I0227 00:50:38.029995 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xlssc" Feb 27 00:50:38 crc kubenswrapper[4781]: I0227 00:50:38.030234 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xlssc" Feb 27 00:50:39 crc kubenswrapper[4781]: I0227 00:50:39.080089 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xlssc" podUID="3f4ba98e-f6f5-41cc-8618-22dfb8700b4c" containerName="registry-server" probeResult="failure" output=< Feb 27 00:50:39 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s Feb 27 00:50:39 crc kubenswrapper[4781]: > Feb 27 00:50:39 crc kubenswrapper[4781]: I0227 00:50:39.310189 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" Feb 27 00:50:39 crc kubenswrapper[4781]: E0227 00:50:39.310542 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 
00:50:45 crc kubenswrapper[4781]: I0227 00:50:45.894064 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-qswm5" podUID="e55472e5-5c75-4b68-9c22-dcf37baffe6a" containerName="registry-server" probeResult="failure" output=< Feb 27 00:50:45 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s Feb 27 00:50:45 crc kubenswrapper[4781]: > Feb 27 00:50:48 crc kubenswrapper[4781]: I0227 00:50:48.076789 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xlssc" Feb 27 00:50:48 crc kubenswrapper[4781]: I0227 00:50:48.129812 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xlssc" Feb 27 00:50:48 crc kubenswrapper[4781]: I0227 00:50:48.312664 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xlssc"] Feb 27 00:50:49 crc kubenswrapper[4781]: I0227 00:50:49.510167 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xlssc" podUID="3f4ba98e-f6f5-41cc-8618-22dfb8700b4c" containerName="registry-server" containerID="cri-o://137a771e08a94de721dc80e4a4977a1120b1608ed3b54589424df944d9bdf9b9" gracePeriod=2 Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.133401 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xlssc" Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.264523 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c-catalog-content\") pod \"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c\" (UID: \"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c\") " Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.264566 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c-utilities\") pod \"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c\" (UID: \"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c\") " Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.264678 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcb47\" (UniqueName: \"kubernetes.io/projected/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c-kube-api-access-pcb47\") pod \"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c\" (UID: \"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c\") " Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.265528 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c-utilities" (OuterVolumeSpecName: "utilities") pod "3f4ba98e-f6f5-41cc-8618-22dfb8700b4c" (UID: "3f4ba98e-f6f5-41cc-8618-22dfb8700b4c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.272818 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c-kube-api-access-pcb47" (OuterVolumeSpecName: "kube-api-access-pcb47") pod "3f4ba98e-f6f5-41cc-8618-22dfb8700b4c" (UID: "3f4ba98e-f6f5-41cc-8618-22dfb8700b4c"). InnerVolumeSpecName "kube-api-access-pcb47". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.367214 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.367252 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcb47\" (UniqueName: \"kubernetes.io/projected/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c-kube-api-access-pcb47\") on node \"crc\" DevicePath \"\"" Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.409889 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3f4ba98e-f6f5-41cc-8618-22dfb8700b4c" (UID: "3f4ba98e-f6f5-41cc-8618-22dfb8700b4c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.469893 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.520795 4781 generic.go:334] "Generic (PLEG): container finished" podID="3f4ba98e-f6f5-41cc-8618-22dfb8700b4c" containerID="137a771e08a94de721dc80e4a4977a1120b1608ed3b54589424df944d9bdf9b9" exitCode=0 Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.520844 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xlssc" Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.520856 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xlssc" event={"ID":"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c","Type":"ContainerDied","Data":"137a771e08a94de721dc80e4a4977a1120b1608ed3b54589424df944d9bdf9b9"} Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.522181 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xlssc" event={"ID":"3f4ba98e-f6f5-41cc-8618-22dfb8700b4c","Type":"ContainerDied","Data":"556870de211a67a576b43464c6f861f745bcebf6538ad4c84adeb7c03d8872b7"} Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.522257 4781 scope.go:117] "RemoveContainer" containerID="137a771e08a94de721dc80e4a4977a1120b1608ed3b54589424df944d9bdf9b9" Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.556271 4781 scope.go:117] "RemoveContainer" containerID="f9bd335f29629807774b995c1792f6d10b37004e22fc7bebdd3bf91a9936e8f9" Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.583467 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xlssc"] Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.593863 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xlssc"] Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.709837 4781 scope.go:117] "RemoveContainer" containerID="e9a2b535c63935aa029b83cd83e6fcc05f884ee982b0306361645077d432a613" Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.782790 4781 scope.go:117] "RemoveContainer" containerID="137a771e08a94de721dc80e4a4977a1120b1608ed3b54589424df944d9bdf9b9" Feb 27 00:50:50 crc kubenswrapper[4781]: E0227 00:50:50.790869 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"137a771e08a94de721dc80e4a4977a1120b1608ed3b54589424df944d9bdf9b9\": container with ID starting with 137a771e08a94de721dc80e4a4977a1120b1608ed3b54589424df944d9bdf9b9 not found: ID does not exist" containerID="137a771e08a94de721dc80e4a4977a1120b1608ed3b54589424df944d9bdf9b9" Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.790919 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"137a771e08a94de721dc80e4a4977a1120b1608ed3b54589424df944d9bdf9b9"} err="failed to get container status \"137a771e08a94de721dc80e4a4977a1120b1608ed3b54589424df944d9bdf9b9\": rpc error: code = NotFound desc = could not find container \"137a771e08a94de721dc80e4a4977a1120b1608ed3b54589424df944d9bdf9b9\": container with ID starting with 137a771e08a94de721dc80e4a4977a1120b1608ed3b54589424df944d9bdf9b9 not found: ID does not exist" Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.790946 4781 scope.go:117] "RemoveContainer" containerID="f9bd335f29629807774b995c1792f6d10b37004e22fc7bebdd3bf91a9936e8f9" Feb 27 00:50:50 crc kubenswrapper[4781]: E0227 00:50:50.792867 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9bd335f29629807774b995c1792f6d10b37004e22fc7bebdd3bf91a9936e8f9\": container with ID starting with f9bd335f29629807774b995c1792f6d10b37004e22fc7bebdd3bf91a9936e8f9 not found: ID does not exist" containerID="f9bd335f29629807774b995c1792f6d10b37004e22fc7bebdd3bf91a9936e8f9" Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.792923 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9bd335f29629807774b995c1792f6d10b37004e22fc7bebdd3bf91a9936e8f9"} err="failed to get container status \"f9bd335f29629807774b995c1792f6d10b37004e22fc7bebdd3bf91a9936e8f9\": rpc error: code = NotFound desc = could not find container \"f9bd335f29629807774b995c1792f6d10b37004e22fc7bebdd3bf91a9936e8f9\": container with ID 
starting with f9bd335f29629807774b995c1792f6d10b37004e22fc7bebdd3bf91a9936e8f9 not found: ID does not exist" Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.792966 4781 scope.go:117] "RemoveContainer" containerID="e9a2b535c63935aa029b83cd83e6fcc05f884ee982b0306361645077d432a613" Feb 27 00:50:50 crc kubenswrapper[4781]: E0227 00:50:50.793234 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9a2b535c63935aa029b83cd83e6fcc05f884ee982b0306361645077d432a613\": container with ID starting with e9a2b535c63935aa029b83cd83e6fcc05f884ee982b0306361645077d432a613 not found: ID does not exist" containerID="e9a2b535c63935aa029b83cd83e6fcc05f884ee982b0306361645077d432a613" Feb 27 00:50:50 crc kubenswrapper[4781]: I0227 00:50:50.793256 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9a2b535c63935aa029b83cd83e6fcc05f884ee982b0306361645077d432a613"} err="failed to get container status \"e9a2b535c63935aa029b83cd83e6fcc05f884ee982b0306361645077d432a613\": rpc error: code = NotFound desc = could not find container \"e9a2b535c63935aa029b83cd83e6fcc05f884ee982b0306361645077d432a613\": container with ID starting with e9a2b535c63935aa029b83cd83e6fcc05f884ee982b0306361645077d432a613 not found: ID does not exist" Feb 27 00:50:51 crc kubenswrapper[4781]: I0227 00:50:51.315615 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" Feb 27 00:50:51 crc kubenswrapper[4781]: E0227 00:50:51.315899 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" 
podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:50:51 crc kubenswrapper[4781]: I0227 00:50:51.320865 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f4ba98e-f6f5-41cc-8618-22dfb8700b4c" path="/var/lib/kubelet/pods/3f4ba98e-f6f5-41cc-8618-22dfb8700b4c/volumes" Feb 27 00:50:51 crc kubenswrapper[4781]: I0227 00:50:51.536420 4781 generic.go:334] "Generic (PLEG): container finished" podID="7a6c3903-7dfd-49cd-a92f-d138e10db404" containerID="892f0bf5e76001c655bd1216bacfc80b10ec06394101b3e897d30710d368bcae" exitCode=0 Feb 27 00:50:51 crc kubenswrapper[4781]: I0227 00:50:51.536470 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs" event={"ID":"7a6c3903-7dfd-49cd-a92f-d138e10db404","Type":"ContainerDied","Data":"892f0bf5e76001c655bd1216bacfc80b10ec06394101b3e897d30710d368bcae"} Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.090903 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs" Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.136739 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ssh-key-openstack-edpm-ipam\") pod \"7a6c3903-7dfd-49cd-a92f-d138e10db404\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.136793 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ceilometer-compute-config-data-2\") pod \"7a6c3903-7dfd-49cd-a92f-d138e10db404\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.136820 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-inventory\") pod \"7a6c3903-7dfd-49cd-a92f-d138e10db404\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.136874 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ceilometer-compute-config-data-0\") pod \"7a6c3903-7dfd-49cd-a92f-d138e10db404\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.136899 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-telemetry-combined-ca-bundle\") pod \"7a6c3903-7dfd-49cd-a92f-d138e10db404\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " Feb 27 00:50:53 crc 
kubenswrapper[4781]: I0227 00:50:53.136920 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78gjh\" (UniqueName: \"kubernetes.io/projected/7a6c3903-7dfd-49cd-a92f-d138e10db404-kube-api-access-78gjh\") pod \"7a6c3903-7dfd-49cd-a92f-d138e10db404\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.136997 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ceilometer-compute-config-data-1\") pod \"7a6c3903-7dfd-49cd-a92f-d138e10db404\" (UID: \"7a6c3903-7dfd-49cd-a92f-d138e10db404\") " Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.152810 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "7a6c3903-7dfd-49cd-a92f-d138e10db404" (UID: "7a6c3903-7dfd-49cd-a92f-d138e10db404"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.161855 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a6c3903-7dfd-49cd-a92f-d138e10db404-kube-api-access-78gjh" (OuterVolumeSpecName: "kube-api-access-78gjh") pod "7a6c3903-7dfd-49cd-a92f-d138e10db404" (UID: "7a6c3903-7dfd-49cd-a92f-d138e10db404"). InnerVolumeSpecName "kube-api-access-78gjh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.197014 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-inventory" (OuterVolumeSpecName: "inventory") pod "7a6c3903-7dfd-49cd-a92f-d138e10db404" (UID: "7a6c3903-7dfd-49cd-a92f-d138e10db404"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.242102 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.242134 4781 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.242146 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78gjh\" (UniqueName: \"kubernetes.io/projected/7a6c3903-7dfd-49cd-a92f-d138e10db404-kube-api-access-78gjh\") on node \"crc\" DevicePath \"\"" Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.242361 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "7a6c3903-7dfd-49cd-a92f-d138e10db404" (UID: "7a6c3903-7dfd-49cd-a92f-d138e10db404"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.280127 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "7a6c3903-7dfd-49cd-a92f-d138e10db404" (UID: "7a6c3903-7dfd-49cd-a92f-d138e10db404"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.297489 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7a6c3903-7dfd-49cd-a92f-d138e10db404" (UID: "7a6c3903-7dfd-49cd-a92f-d138e10db404"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.297724 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "7a6c3903-7dfd-49cd-a92f-d138e10db404" (UID: "7a6c3903-7dfd-49cd-a92f-d138e10db404"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.350028 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.350075 4781 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.350087 4781 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.350099 4781 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7a6c3903-7dfd-49cd-a92f-d138e10db404-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.556442 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs" event={"ID":"7a6c3903-7dfd-49cd-a92f-d138e10db404","Type":"ContainerDied","Data":"c98c73db13c2e6f240e777de96c450d5ef1c4ef457d610c978e3c63e24c6b834"} Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.556487 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c98c73db13c2e6f240e777de96c450d5ef1c4ef457d610c978e3c63e24c6b834" Feb 27 00:50:53 crc kubenswrapper[4781]: I0227 00:50:53.556512 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs" Feb 27 00:50:54 crc kubenswrapper[4781]: I0227 00:50:54.898477 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qswm5" Feb 27 00:50:54 crc kubenswrapper[4781]: I0227 00:50:54.957546 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qswm5" Feb 27 00:50:55 crc kubenswrapper[4781]: I0227 00:50:55.712986 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qswm5"] Feb 27 00:50:56 crc kubenswrapper[4781]: I0227 00:50:56.582362 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qswm5" podUID="e55472e5-5c75-4b68-9c22-dcf37baffe6a" containerName="registry-server" containerID="cri-o://448f33b37ba3a7f2f89fc829e1f27aba076c2ccace43bc5bc17bed3c43c44cd6" gracePeriod=2 Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.140597 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qswm5" Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.227140 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e55472e5-5c75-4b68-9c22-dcf37baffe6a-utilities\") pod \"e55472e5-5c75-4b68-9c22-dcf37baffe6a\" (UID: \"e55472e5-5c75-4b68-9c22-dcf37baffe6a\") " Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.227255 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtf8b\" (UniqueName: \"kubernetes.io/projected/e55472e5-5c75-4b68-9c22-dcf37baffe6a-kube-api-access-mtf8b\") pod \"e55472e5-5c75-4b68-9c22-dcf37baffe6a\" (UID: \"e55472e5-5c75-4b68-9c22-dcf37baffe6a\") " Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.227562 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e55472e5-5c75-4b68-9c22-dcf37baffe6a-catalog-content\") pod \"e55472e5-5c75-4b68-9c22-dcf37baffe6a\" (UID: \"e55472e5-5c75-4b68-9c22-dcf37baffe6a\") " Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.229130 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e55472e5-5c75-4b68-9c22-dcf37baffe6a-utilities" (OuterVolumeSpecName: "utilities") pod "e55472e5-5c75-4b68-9c22-dcf37baffe6a" (UID: "e55472e5-5c75-4b68-9c22-dcf37baffe6a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.237872 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e55472e5-5c75-4b68-9c22-dcf37baffe6a-kube-api-access-mtf8b" (OuterVolumeSpecName: "kube-api-access-mtf8b") pod "e55472e5-5c75-4b68-9c22-dcf37baffe6a" (UID: "e55472e5-5c75-4b68-9c22-dcf37baffe6a"). InnerVolumeSpecName "kube-api-access-mtf8b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.284924 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e55472e5-5c75-4b68-9c22-dcf37baffe6a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e55472e5-5c75-4b68-9c22-dcf37baffe6a" (UID: "e55472e5-5c75-4b68-9c22-dcf37baffe6a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.333467 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e55472e5-5c75-4b68-9c22-dcf37baffe6a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.333504 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e55472e5-5c75-4b68-9c22-dcf37baffe6a-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.333515 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtf8b\" (UniqueName: \"kubernetes.io/projected/e55472e5-5c75-4b68-9c22-dcf37baffe6a-kube-api-access-mtf8b\") on node \"crc\" DevicePath \"\"" Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.593011 4781 generic.go:334] "Generic (PLEG): container finished" podID="e55472e5-5c75-4b68-9c22-dcf37baffe6a" containerID="448f33b37ba3a7f2f89fc829e1f27aba076c2ccace43bc5bc17bed3c43c44cd6" exitCode=0 Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.593054 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qswm5" Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.593061 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qswm5" event={"ID":"e55472e5-5c75-4b68-9c22-dcf37baffe6a","Type":"ContainerDied","Data":"448f33b37ba3a7f2f89fc829e1f27aba076c2ccace43bc5bc17bed3c43c44cd6"} Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.593090 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qswm5" event={"ID":"e55472e5-5c75-4b68-9c22-dcf37baffe6a","Type":"ContainerDied","Data":"ec42b700956099171afc0f35a3868f09a1f40ac9aa51906b323a60c042870bbe"} Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.593111 4781 scope.go:117] "RemoveContainer" containerID="448f33b37ba3a7f2f89fc829e1f27aba076c2ccace43bc5bc17bed3c43c44cd6" Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.614450 4781 scope.go:117] "RemoveContainer" containerID="4b2f47ab040f83d4d4d60dc545300765af7ebb8d17389a8134432cba5b907269" Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.620323 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qswm5"] Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.632058 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qswm5"] Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.633898 4781 scope.go:117] "RemoveContainer" containerID="ca0d4a966b85127f600d15aa94b7dd95b92cd6c15a6584862d4a007c754f3f74" Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.680852 4781 scope.go:117] "RemoveContainer" containerID="448f33b37ba3a7f2f89fc829e1f27aba076c2ccace43bc5bc17bed3c43c44cd6" Feb 27 00:50:57 crc kubenswrapper[4781]: E0227 00:50:57.682438 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"448f33b37ba3a7f2f89fc829e1f27aba076c2ccace43bc5bc17bed3c43c44cd6\": container with ID starting with 448f33b37ba3a7f2f89fc829e1f27aba076c2ccace43bc5bc17bed3c43c44cd6 not found: ID does not exist" containerID="448f33b37ba3a7f2f89fc829e1f27aba076c2ccace43bc5bc17bed3c43c44cd6" Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.682474 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"448f33b37ba3a7f2f89fc829e1f27aba076c2ccace43bc5bc17bed3c43c44cd6"} err="failed to get container status \"448f33b37ba3a7f2f89fc829e1f27aba076c2ccace43bc5bc17bed3c43c44cd6\": rpc error: code = NotFound desc = could not find container \"448f33b37ba3a7f2f89fc829e1f27aba076c2ccace43bc5bc17bed3c43c44cd6\": container with ID starting with 448f33b37ba3a7f2f89fc829e1f27aba076c2ccace43bc5bc17bed3c43c44cd6 not found: ID does not exist" Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.682494 4781 scope.go:117] "RemoveContainer" containerID="4b2f47ab040f83d4d4d60dc545300765af7ebb8d17389a8134432cba5b907269" Feb 27 00:50:57 crc kubenswrapper[4781]: E0227 00:50:57.685003 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b2f47ab040f83d4d4d60dc545300765af7ebb8d17389a8134432cba5b907269\": container with ID starting with 4b2f47ab040f83d4d4d60dc545300765af7ebb8d17389a8134432cba5b907269 not found: ID does not exist" containerID="4b2f47ab040f83d4d4d60dc545300765af7ebb8d17389a8134432cba5b907269" Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.685063 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b2f47ab040f83d4d4d60dc545300765af7ebb8d17389a8134432cba5b907269"} err="failed to get container status \"4b2f47ab040f83d4d4d60dc545300765af7ebb8d17389a8134432cba5b907269\": rpc error: code = NotFound desc = could not find container \"4b2f47ab040f83d4d4d60dc545300765af7ebb8d17389a8134432cba5b907269\": container with ID 
starting with 4b2f47ab040f83d4d4d60dc545300765af7ebb8d17389a8134432cba5b907269 not found: ID does not exist" Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.685093 4781 scope.go:117] "RemoveContainer" containerID="ca0d4a966b85127f600d15aa94b7dd95b92cd6c15a6584862d4a007c754f3f74" Feb 27 00:50:57 crc kubenswrapper[4781]: E0227 00:50:57.686588 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca0d4a966b85127f600d15aa94b7dd95b92cd6c15a6584862d4a007c754f3f74\": container with ID starting with ca0d4a966b85127f600d15aa94b7dd95b92cd6c15a6584862d4a007c754f3f74 not found: ID does not exist" containerID="ca0d4a966b85127f600d15aa94b7dd95b92cd6c15a6584862d4a007c754f3f74" Feb 27 00:50:57 crc kubenswrapper[4781]: I0227 00:50:57.686646 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca0d4a966b85127f600d15aa94b7dd95b92cd6c15a6584862d4a007c754f3f74"} err="failed to get container status \"ca0d4a966b85127f600d15aa94b7dd95b92cd6c15a6584862d4a007c754f3f74\": rpc error: code = NotFound desc = could not find container \"ca0d4a966b85127f600d15aa94b7dd95b92cd6c15a6584862d4a007c754f3f74\": container with ID starting with ca0d4a966b85127f600d15aa94b7dd95b92cd6c15a6584862d4a007c754f3f74 not found: ID does not exist" Feb 27 00:50:59 crc kubenswrapper[4781]: I0227 00:50:59.321824 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e55472e5-5c75-4b68-9c22-dcf37baffe6a" path="/var/lib/kubelet/pods/e55472e5-5c75-4b68-9c22-dcf37baffe6a/volumes" Feb 27 00:51:06 crc kubenswrapper[4781]: I0227 00:51:06.310073 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" Feb 27 00:51:06 crc kubenswrapper[4781]: E0227 00:51:06.310834 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:51:20 crc kubenswrapper[4781]: I0227 00:51:20.309953 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" Feb 27 00:51:20 crc kubenswrapper[4781]: E0227 00:51:20.310735 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:51:31 crc kubenswrapper[4781]: I0227 00:51:31.319509 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" Feb 27 00:51:31 crc kubenswrapper[4781]: E0227 00:51:31.320374 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:51:44 crc kubenswrapper[4781]: I0227 00:51:44.309488 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" Feb 27 00:51:45 crc kubenswrapper[4781]: I0227 00:51:45.007072 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" 
event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"9eb13c9d0480acfdc7ab15c203347b34a63e0504efc9127264d926b2dd0b3a20"} Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.162294 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535892-5kr7b"] Feb 27 00:52:00 crc kubenswrapper[4781]: E0227 00:52:00.163363 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e55472e5-5c75-4b68-9c22-dcf37baffe6a" containerName="extract-content" Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.163379 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e55472e5-5c75-4b68-9c22-dcf37baffe6a" containerName="extract-content" Feb 27 00:52:00 crc kubenswrapper[4781]: E0227 00:52:00.163412 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e55472e5-5c75-4b68-9c22-dcf37baffe6a" containerName="extract-utilities" Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.163419 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e55472e5-5c75-4b68-9c22-dcf37baffe6a" containerName="extract-utilities" Feb 27 00:52:00 crc kubenswrapper[4781]: E0227 00:52:00.163432 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f4ba98e-f6f5-41cc-8618-22dfb8700b4c" containerName="registry-server" Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.163438 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f4ba98e-f6f5-41cc-8618-22dfb8700b4c" containerName="registry-server" Feb 27 00:52:00 crc kubenswrapper[4781]: E0227 00:52:00.163450 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f4ba98e-f6f5-41cc-8618-22dfb8700b4c" containerName="extract-utilities" Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.163456 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f4ba98e-f6f5-41cc-8618-22dfb8700b4c" containerName="extract-utilities" Feb 27 00:52:00 crc kubenswrapper[4781]: E0227 00:52:00.163467 4781 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e55472e5-5c75-4b68-9c22-dcf37baffe6a" containerName="registry-server" Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.163473 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e55472e5-5c75-4b68-9c22-dcf37baffe6a" containerName="registry-server" Feb 27 00:52:00 crc kubenswrapper[4781]: E0227 00:52:00.163480 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a6c3903-7dfd-49cd-a92f-d138e10db404" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.163487 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a6c3903-7dfd-49cd-a92f-d138e10db404" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 27 00:52:00 crc kubenswrapper[4781]: E0227 00:52:00.163503 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f4ba98e-f6f5-41cc-8618-22dfb8700b4c" containerName="extract-content" Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.163508 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f4ba98e-f6f5-41cc-8618-22dfb8700b4c" containerName="extract-content" Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.163743 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a6c3903-7dfd-49cd-a92f-d138e10db404" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.163768 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f4ba98e-f6f5-41cc-8618-22dfb8700b4c" containerName="registry-server" Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.163778 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e55472e5-5c75-4b68-9c22-dcf37baffe6a" containerName="registry-server" Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.164596 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535892-5kr7b" Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.167280 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.167726 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.167887 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.175252 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535892-5kr7b"] Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.270144 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhdm6\" (UniqueName: \"kubernetes.io/projected/7eb81f10-0ea8-4376-b588-a3d9462c0bc4-kube-api-access-dhdm6\") pod \"auto-csr-approver-29535892-5kr7b\" (UID: \"7eb81f10-0ea8-4376-b588-a3d9462c0bc4\") " pod="openshift-infra/auto-csr-approver-29535892-5kr7b" Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.372668 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhdm6\" (UniqueName: \"kubernetes.io/projected/7eb81f10-0ea8-4376-b588-a3d9462c0bc4-kube-api-access-dhdm6\") pod \"auto-csr-approver-29535892-5kr7b\" (UID: \"7eb81f10-0ea8-4376-b588-a3d9462c0bc4\") " pod="openshift-infra/auto-csr-approver-29535892-5kr7b" Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.395377 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhdm6\" (UniqueName: \"kubernetes.io/projected/7eb81f10-0ea8-4376-b588-a3d9462c0bc4-kube-api-access-dhdm6\") pod \"auto-csr-approver-29535892-5kr7b\" (UID: \"7eb81f10-0ea8-4376-b588-a3d9462c0bc4\") " 
pod="openshift-infra/auto-csr-approver-29535892-5kr7b" Feb 27 00:52:00 crc kubenswrapper[4781]: I0227 00:52:00.489272 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535892-5kr7b" Feb 27 00:52:01 crc kubenswrapper[4781]: I0227 00:52:01.007508 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535892-5kr7b"] Feb 27 00:52:01 crc kubenswrapper[4781]: I0227 00:52:01.161580 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535892-5kr7b" event={"ID":"7eb81f10-0ea8-4376-b588-a3d9462c0bc4","Type":"ContainerStarted","Data":"81f5d9ffd1253c0fe66074826a1be97377eb0943041ec8404b90de2bb8cd82b6"} Feb 27 00:52:03 crc kubenswrapper[4781]: I0227 00:52:03.182091 4781 generic.go:334] "Generic (PLEG): container finished" podID="7eb81f10-0ea8-4376-b588-a3d9462c0bc4" containerID="f073e5337bf518b81829559352fcea1859d4a5ced7771a4c11f45807c039ab0a" exitCode=0 Feb 27 00:52:03 crc kubenswrapper[4781]: I0227 00:52:03.182188 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535892-5kr7b" event={"ID":"7eb81f10-0ea8-4376-b588-a3d9462c0bc4","Type":"ContainerDied","Data":"f073e5337bf518b81829559352fcea1859d4a5ced7771a4c11f45807c039ab0a"} Feb 27 00:52:04 crc kubenswrapper[4781]: I0227 00:52:04.680246 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535892-5kr7b" Feb 27 00:52:04 crc kubenswrapper[4781]: I0227 00:52:04.766555 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhdm6\" (UniqueName: \"kubernetes.io/projected/7eb81f10-0ea8-4376-b588-a3d9462c0bc4-kube-api-access-dhdm6\") pod \"7eb81f10-0ea8-4376-b588-a3d9462c0bc4\" (UID: \"7eb81f10-0ea8-4376-b588-a3d9462c0bc4\") " Feb 27 00:52:04 crc kubenswrapper[4781]: I0227 00:52:04.772056 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7eb81f10-0ea8-4376-b588-a3d9462c0bc4-kube-api-access-dhdm6" (OuterVolumeSpecName: "kube-api-access-dhdm6") pod "7eb81f10-0ea8-4376-b588-a3d9462c0bc4" (UID: "7eb81f10-0ea8-4376-b588-a3d9462c0bc4"). InnerVolumeSpecName "kube-api-access-dhdm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:52:04 crc kubenswrapper[4781]: I0227 00:52:04.869475 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhdm6\" (UniqueName: \"kubernetes.io/projected/7eb81f10-0ea8-4376-b588-a3d9462c0bc4-kube-api-access-dhdm6\") on node \"crc\" DevicePath \"\"" Feb 27 00:52:05 crc kubenswrapper[4781]: I0227 00:52:05.201767 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535892-5kr7b" event={"ID":"7eb81f10-0ea8-4376-b588-a3d9462c0bc4","Type":"ContainerDied","Data":"81f5d9ffd1253c0fe66074826a1be97377eb0943041ec8404b90de2bb8cd82b6"} Feb 27 00:52:05 crc kubenswrapper[4781]: I0227 00:52:05.202093 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81f5d9ffd1253c0fe66074826a1be97377eb0943041ec8404b90de2bb8cd82b6" Feb 27 00:52:05 crc kubenswrapper[4781]: I0227 00:52:05.201845 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535892-5kr7b" Feb 27 00:52:05 crc kubenswrapper[4781]: I0227 00:52:05.765322 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535886-5khtq"] Feb 27 00:52:05 crc kubenswrapper[4781]: I0227 00:52:05.775675 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535886-5khtq"] Feb 27 00:52:07 crc kubenswrapper[4781]: I0227 00:52:07.324387 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8143ddb0-990c-4f1e-9130-7ca30776e64b" path="/var/lib/kubelet/pods/8143ddb0-990c-4f1e-9130-7ca30776e64b/volumes" Feb 27 00:52:31 crc kubenswrapper[4781]: I0227 00:52:31.778666 4781 scope.go:117] "RemoveContainer" containerID="e0bb531ca8e9ee4c1a35ccb62422bfe50af2c334314f4bd145d5137b8ad741e6" Feb 27 00:52:39 crc kubenswrapper[4781]: I0227 00:52:39.931901 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sfnkp"] Feb 27 00:52:39 crc kubenswrapper[4781]: E0227 00:52:39.933375 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eb81f10-0ea8-4376-b588-a3d9462c0bc4" containerName="oc" Feb 27 00:52:39 crc kubenswrapper[4781]: I0227 00:52:39.933393 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eb81f10-0ea8-4376-b588-a3d9462c0bc4" containerName="oc" Feb 27 00:52:39 crc kubenswrapper[4781]: I0227 00:52:39.933599 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="7eb81f10-0ea8-4376-b588-a3d9462c0bc4" containerName="oc" Feb 27 00:52:39 crc kubenswrapper[4781]: I0227 00:52:39.935299 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sfnkp" Feb 27 00:52:39 crc kubenswrapper[4781]: I0227 00:52:39.943185 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sfnkp"] Feb 27 00:52:40 crc kubenswrapper[4781]: I0227 00:52:40.061695 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037-utilities\") pod \"community-operators-sfnkp\" (UID: \"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037\") " pod="openshift-marketplace/community-operators-sfnkp" Feb 27 00:52:40 crc kubenswrapper[4781]: I0227 00:52:40.061809 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037-catalog-content\") pod \"community-operators-sfnkp\" (UID: \"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037\") " pod="openshift-marketplace/community-operators-sfnkp" Feb 27 00:52:40 crc kubenswrapper[4781]: I0227 00:52:40.061856 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqwm8\" (UniqueName: \"kubernetes.io/projected/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037-kube-api-access-sqwm8\") pod \"community-operators-sfnkp\" (UID: \"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037\") " pod="openshift-marketplace/community-operators-sfnkp" Feb 27 00:52:40 crc kubenswrapper[4781]: I0227 00:52:40.163966 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037-catalog-content\") pod \"community-operators-sfnkp\" (UID: \"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037\") " pod="openshift-marketplace/community-operators-sfnkp" Feb 27 00:52:40 crc kubenswrapper[4781]: I0227 00:52:40.164037 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-sqwm8\" (UniqueName: \"kubernetes.io/projected/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037-kube-api-access-sqwm8\") pod \"community-operators-sfnkp\" (UID: \"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037\") " pod="openshift-marketplace/community-operators-sfnkp" Feb 27 00:52:40 crc kubenswrapper[4781]: I0227 00:52:40.164166 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037-utilities\") pod \"community-operators-sfnkp\" (UID: \"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037\") " pod="openshift-marketplace/community-operators-sfnkp" Feb 27 00:52:40 crc kubenswrapper[4781]: I0227 00:52:40.164674 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037-utilities\") pod \"community-operators-sfnkp\" (UID: \"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037\") " pod="openshift-marketplace/community-operators-sfnkp" Feb 27 00:52:40 crc kubenswrapper[4781]: I0227 00:52:40.164674 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037-catalog-content\") pod \"community-operators-sfnkp\" (UID: \"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037\") " pod="openshift-marketplace/community-operators-sfnkp" Feb 27 00:52:40 crc kubenswrapper[4781]: I0227 00:52:40.188886 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqwm8\" (UniqueName: \"kubernetes.io/projected/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037-kube-api-access-sqwm8\") pod \"community-operators-sfnkp\" (UID: \"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037\") " pod="openshift-marketplace/community-operators-sfnkp" Feb 27 00:52:40 crc kubenswrapper[4781]: I0227 00:52:40.255164 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sfnkp" Feb 27 00:52:40 crc kubenswrapper[4781]: I0227 00:52:40.816896 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sfnkp"] Feb 27 00:52:41 crc kubenswrapper[4781]: I0227 00:52:41.579116 4781 generic.go:334] "Generic (PLEG): container finished" podID="d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037" containerID="72d7728ad61520c86612aae9166887246813c276f46f9d51b03f34ba7849805a" exitCode=0 Feb 27 00:52:41 crc kubenswrapper[4781]: I0227 00:52:41.579234 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfnkp" event={"ID":"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037","Type":"ContainerDied","Data":"72d7728ad61520c86612aae9166887246813c276f46f9d51b03f34ba7849805a"} Feb 27 00:52:41 crc kubenswrapper[4781]: I0227 00:52:41.579420 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfnkp" event={"ID":"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037","Type":"ContainerStarted","Data":"3b315d0c2f14459090fc3e1cdd3ea378a65d651e1ca9cba884325f43c392d3da"} Feb 27 00:52:43 crc kubenswrapper[4781]: I0227 00:52:43.597617 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfnkp" event={"ID":"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037","Type":"ContainerStarted","Data":"8bd85a8915daea34807d1acc768da5abbcae463c7612652dcb0251531d04f698"} Feb 27 00:52:44 crc kubenswrapper[4781]: I0227 00:52:44.607859 4781 generic.go:334] "Generic (PLEG): container finished" podID="d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037" containerID="8bd85a8915daea34807d1acc768da5abbcae463c7612652dcb0251531d04f698" exitCode=0 Feb 27 00:52:44 crc kubenswrapper[4781]: I0227 00:52:44.608062 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfnkp" 
event={"ID":"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037","Type":"ContainerDied","Data":"8bd85a8915daea34807d1acc768da5abbcae463c7612652dcb0251531d04f698"} Feb 27 00:52:45 crc kubenswrapper[4781]: I0227 00:52:45.625491 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfnkp" event={"ID":"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037","Type":"ContainerStarted","Data":"6cbfe7be30a34a6be35d29972b8955367be6e146df54c84b4d37e8823dd35994"} Feb 27 00:52:45 crc kubenswrapper[4781]: I0227 00:52:45.654904 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sfnkp" podStartSLOduration=3.255425093 podStartE2EDuration="6.65487939s" podCreationTimestamp="2026-02-27 00:52:39 +0000 UTC" firstStartedPulling="2026-02-27 00:52:41.582681111 +0000 UTC m=+2830.840220675" lastFinishedPulling="2026-02-27 00:52:44.982135418 +0000 UTC m=+2834.239674972" observedRunningTime="2026-02-27 00:52:45.644929978 +0000 UTC m=+2834.902469522" watchObservedRunningTime="2026-02-27 00:52:45.65487939 +0000 UTC m=+2834.912418954" Feb 27 00:52:50 crc kubenswrapper[4781]: I0227 00:52:50.255566 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sfnkp" Feb 27 00:52:50 crc kubenswrapper[4781]: I0227 00:52:50.255917 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sfnkp" Feb 27 00:52:50 crc kubenswrapper[4781]: I0227 00:52:50.309593 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sfnkp" Feb 27 00:52:50 crc kubenswrapper[4781]: I0227 00:52:50.721408 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sfnkp" Feb 27 00:52:50 crc kubenswrapper[4781]: I0227 00:52:50.769753 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-sfnkp"] Feb 27 00:52:52 crc kubenswrapper[4781]: I0227 00:52:52.692508 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sfnkp" podUID="d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037" containerName="registry-server" containerID="cri-o://6cbfe7be30a34a6be35d29972b8955367be6e146df54c84b4d37e8823dd35994" gracePeriod=2 Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.362328 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sfnkp" Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.454017 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037-catalog-content\") pod \"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037\" (UID: \"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037\") " Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.455933 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqwm8\" (UniqueName: \"kubernetes.io/projected/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037-kube-api-access-sqwm8\") pod \"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037\" (UID: \"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037\") " Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.456050 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037-utilities\") pod \"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037\" (UID: \"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037\") " Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.459084 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037-utilities" (OuterVolumeSpecName: "utilities") pod "d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037" (UID: 
"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.478501 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037-kube-api-access-sqwm8" (OuterVolumeSpecName: "kube-api-access-sqwm8") pod "d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037" (UID: "d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037"). InnerVolumeSpecName "kube-api-access-sqwm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.511979 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037" (UID: "d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.560110 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.560364 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqwm8\" (UniqueName: \"kubernetes.io/projected/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037-kube-api-access-sqwm8\") on node \"crc\" DevicePath \"\"" Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.560448 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.704678 4781 generic.go:334] "Generic (PLEG): container finished" 
podID="d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037" containerID="6cbfe7be30a34a6be35d29972b8955367be6e146df54c84b4d37e8823dd35994" exitCode=0 Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.704799 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfnkp" event={"ID":"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037","Type":"ContainerDied","Data":"6cbfe7be30a34a6be35d29972b8955367be6e146df54c84b4d37e8823dd35994"} Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.705057 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfnkp" event={"ID":"d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037","Type":"ContainerDied","Data":"3b315d0c2f14459090fc3e1cdd3ea378a65d651e1ca9cba884325f43c392d3da"} Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.705083 4781 scope.go:117] "RemoveContainer" containerID="6cbfe7be30a34a6be35d29972b8955367be6e146df54c84b4d37e8823dd35994" Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.704835 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sfnkp" Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.766340 4781 scope.go:117] "RemoveContainer" containerID="8bd85a8915daea34807d1acc768da5abbcae463c7612652dcb0251531d04f698" Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.777731 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sfnkp"] Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.789254 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sfnkp"] Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.821561 4781 scope.go:117] "RemoveContainer" containerID="72d7728ad61520c86612aae9166887246813c276f46f9d51b03f34ba7849805a" Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.869408 4781 scope.go:117] "RemoveContainer" containerID="6cbfe7be30a34a6be35d29972b8955367be6e146df54c84b4d37e8823dd35994" Feb 27 00:52:53 crc kubenswrapper[4781]: E0227 00:52:53.869909 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cbfe7be30a34a6be35d29972b8955367be6e146df54c84b4d37e8823dd35994\": container with ID starting with 6cbfe7be30a34a6be35d29972b8955367be6e146df54c84b4d37e8823dd35994 not found: ID does not exist" containerID="6cbfe7be30a34a6be35d29972b8955367be6e146df54c84b4d37e8823dd35994" Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.869946 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cbfe7be30a34a6be35d29972b8955367be6e146df54c84b4d37e8823dd35994"} err="failed to get container status \"6cbfe7be30a34a6be35d29972b8955367be6e146df54c84b4d37e8823dd35994\": rpc error: code = NotFound desc = could not find container \"6cbfe7be30a34a6be35d29972b8955367be6e146df54c84b4d37e8823dd35994\": container with ID starting with 6cbfe7be30a34a6be35d29972b8955367be6e146df54c84b4d37e8823dd35994 not 
found: ID does not exist" Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.869971 4781 scope.go:117] "RemoveContainer" containerID="8bd85a8915daea34807d1acc768da5abbcae463c7612652dcb0251531d04f698" Feb 27 00:52:53 crc kubenswrapper[4781]: E0227 00:52:53.870335 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bd85a8915daea34807d1acc768da5abbcae463c7612652dcb0251531d04f698\": container with ID starting with 8bd85a8915daea34807d1acc768da5abbcae463c7612652dcb0251531d04f698 not found: ID does not exist" containerID="8bd85a8915daea34807d1acc768da5abbcae463c7612652dcb0251531d04f698" Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.870366 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bd85a8915daea34807d1acc768da5abbcae463c7612652dcb0251531d04f698"} err="failed to get container status \"8bd85a8915daea34807d1acc768da5abbcae463c7612652dcb0251531d04f698\": rpc error: code = NotFound desc = could not find container \"8bd85a8915daea34807d1acc768da5abbcae463c7612652dcb0251531d04f698\": container with ID starting with 8bd85a8915daea34807d1acc768da5abbcae463c7612652dcb0251531d04f698 not found: ID does not exist" Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.870385 4781 scope.go:117] "RemoveContainer" containerID="72d7728ad61520c86612aae9166887246813c276f46f9d51b03f34ba7849805a" Feb 27 00:52:53 crc kubenswrapper[4781]: E0227 00:52:53.871243 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72d7728ad61520c86612aae9166887246813c276f46f9d51b03f34ba7849805a\": container with ID starting with 72d7728ad61520c86612aae9166887246813c276f46f9d51b03f34ba7849805a not found: ID does not exist" containerID="72d7728ad61520c86612aae9166887246813c276f46f9d51b03f34ba7849805a" Feb 27 00:52:53 crc kubenswrapper[4781]: I0227 00:52:53.871274 4781 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72d7728ad61520c86612aae9166887246813c276f46f9d51b03f34ba7849805a"} err="failed to get container status \"72d7728ad61520c86612aae9166887246813c276f46f9d51b03f34ba7849805a\": rpc error: code = NotFound desc = could not find container \"72d7728ad61520c86612aae9166887246813c276f46f9d51b03f34ba7849805a\": container with ID starting with 72d7728ad61520c86612aae9166887246813c276f46f9d51b03f34ba7849805a not found: ID does not exist" Feb 27 00:52:55 crc kubenswrapper[4781]: I0227 00:52:55.321351 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037" path="/var/lib/kubelet/pods/d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037/volumes" Feb 27 00:54:00 crc kubenswrapper[4781]: I0227 00:54:00.153548 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535894-wdzjg"] Feb 27 00:54:00 crc kubenswrapper[4781]: E0227 00:54:00.157849 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037" containerName="extract-content" Feb 27 00:54:00 crc kubenswrapper[4781]: I0227 00:54:00.157884 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037" containerName="extract-content" Feb 27 00:54:00 crc kubenswrapper[4781]: E0227 00:54:00.157915 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037" containerName="registry-server" Feb 27 00:54:00 crc kubenswrapper[4781]: I0227 00:54:00.157924 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037" containerName="registry-server" Feb 27 00:54:00 crc kubenswrapper[4781]: E0227 00:54:00.157958 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037" containerName="extract-utilities" Feb 27 00:54:00 crc kubenswrapper[4781]: I0227 
00:54:00.157968 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037" containerName="extract-utilities" Feb 27 00:54:00 crc kubenswrapper[4781]: I0227 00:54:00.158265 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3c5c35b-c91d-4d87-89ff-1e0d9d6ef037" containerName="registry-server" Feb 27 00:54:00 crc kubenswrapper[4781]: I0227 00:54:00.159599 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535894-wdzjg" Feb 27 00:54:00 crc kubenswrapper[4781]: I0227 00:54:00.161993 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:54:00 crc kubenswrapper[4781]: I0227 00:54:00.162435 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:54:00 crc kubenswrapper[4781]: I0227 00:54:00.163606 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:54:00 crc kubenswrapper[4781]: I0227 00:54:00.175893 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535894-wdzjg"] Feb 27 00:54:00 crc kubenswrapper[4781]: I0227 00:54:00.299151 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql7dj\" (UniqueName: \"kubernetes.io/projected/1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa-kube-api-access-ql7dj\") pod \"auto-csr-approver-29535894-wdzjg\" (UID: \"1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa\") " pod="openshift-infra/auto-csr-approver-29535894-wdzjg" Feb 27 00:54:00 crc kubenswrapper[4781]: I0227 00:54:00.403419 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql7dj\" (UniqueName: \"kubernetes.io/projected/1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa-kube-api-access-ql7dj\") pod \"auto-csr-approver-29535894-wdzjg\" 
(UID: \"1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa\") " pod="openshift-infra/auto-csr-approver-29535894-wdzjg" Feb 27 00:54:00 crc kubenswrapper[4781]: I0227 00:54:00.425770 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql7dj\" (UniqueName: \"kubernetes.io/projected/1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa-kube-api-access-ql7dj\") pod \"auto-csr-approver-29535894-wdzjg\" (UID: \"1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa\") " pod="openshift-infra/auto-csr-approver-29535894-wdzjg" Feb 27 00:54:00 crc kubenswrapper[4781]: I0227 00:54:00.483845 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535894-wdzjg" Feb 27 00:54:00 crc kubenswrapper[4781]: I0227 00:54:00.971573 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535894-wdzjg"] Feb 27 00:54:00 crc kubenswrapper[4781]: I0227 00:54:00.978602 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 00:54:01 crc kubenswrapper[4781]: I0227 00:54:01.426514 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535894-wdzjg" event={"ID":"1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa","Type":"ContainerStarted","Data":"02533eb2fb7ee8c0037fd145c3ab25c86e7b7f823eb3ca9cb44384e6c4541a14"} Feb 27 00:54:03 crc kubenswrapper[4781]: I0227 00:54:03.446832 4781 generic.go:334] "Generic (PLEG): container finished" podID="1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa" containerID="cd5eb21f935374fa744e81c6189b26e6ff6841a0ef882762f86735b2bdaec5ee" exitCode=0 Feb 27 00:54:03 crc kubenswrapper[4781]: I0227 00:54:03.446892 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535894-wdzjg" event={"ID":"1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa","Type":"ContainerDied","Data":"cd5eb21f935374fa744e81c6189b26e6ff6841a0ef882762f86735b2bdaec5ee"} Feb 27 00:54:04 crc kubenswrapper[4781]: I0227 
00:54:04.905619 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535894-wdzjg" Feb 27 00:54:05 crc kubenswrapper[4781]: I0227 00:54:05.012687 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql7dj\" (UniqueName: \"kubernetes.io/projected/1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa-kube-api-access-ql7dj\") pod \"1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa\" (UID: \"1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa\") " Feb 27 00:54:05 crc kubenswrapper[4781]: I0227 00:54:05.019211 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa-kube-api-access-ql7dj" (OuterVolumeSpecName: "kube-api-access-ql7dj") pod "1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa" (UID: "1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa"). InnerVolumeSpecName "kube-api-access-ql7dj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:54:05 crc kubenswrapper[4781]: I0227 00:54:05.115150 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql7dj\" (UniqueName: \"kubernetes.io/projected/1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa-kube-api-access-ql7dj\") on node \"crc\" DevicePath \"\"" Feb 27 00:54:05 crc kubenswrapper[4781]: I0227 00:54:05.468436 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535894-wdzjg" event={"ID":"1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa","Type":"ContainerDied","Data":"02533eb2fb7ee8c0037fd145c3ab25c86e7b7f823eb3ca9cb44384e6c4541a14"} Feb 27 00:54:05 crc kubenswrapper[4781]: I0227 00:54:05.468473 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02533eb2fb7ee8c0037fd145c3ab25c86e7b7f823eb3ca9cb44384e6c4541a14" Feb 27 00:54:05 crc kubenswrapper[4781]: I0227 00:54:05.468523 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535894-wdzjg" Feb 27 00:54:05 crc kubenswrapper[4781]: I0227 00:54:05.989730 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535888-nb28f"] Feb 27 00:54:06 crc kubenswrapper[4781]: I0227 00:54:06.001877 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535888-nb28f"] Feb 27 00:54:07 crc kubenswrapper[4781]: I0227 00:54:07.319719 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="231f1edd-305c-4a6c-bd4e-11c12c2ae515" path="/var/lib/kubelet/pods/231f1edd-305c-4a6c-bd4e-11c12c2ae515/volumes" Feb 27 00:54:12 crc kubenswrapper[4781]: I0227 00:54:12.895006 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:54:12 crc kubenswrapper[4781]: I0227 00:54:12.895422 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:54:31 crc kubenswrapper[4781]: I0227 00:54:31.907429 4781 scope.go:117] "RemoveContainer" containerID="2518570ffdceb97ceb198f4ca24bb08d3d0c202488b87c6e1650891fc7084042" Feb 27 00:54:42 crc kubenswrapper[4781]: I0227 00:54:42.895459 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:54:42 crc kubenswrapper[4781]: 
I0227 00:54:42.896010 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:55:12 crc kubenswrapper[4781]: I0227 00:55:12.895195 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:55:12 crc kubenswrapper[4781]: I0227 00:55:12.895737 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:55:12 crc kubenswrapper[4781]: I0227 00:55:12.895778 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:55:12 crc kubenswrapper[4781]: I0227 00:55:12.896552 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9eb13c9d0480acfdc7ab15c203347b34a63e0504efc9127264d926b2dd0b3a20"} pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 00:55:12 crc kubenswrapper[4781]: I0227 00:55:12.896616 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" 
containerName="machine-config-daemon" containerID="cri-o://9eb13c9d0480acfdc7ab15c203347b34a63e0504efc9127264d926b2dd0b3a20" gracePeriod=600 Feb 27 00:55:13 crc kubenswrapper[4781]: I0227 00:55:13.089470 4781 generic.go:334] "Generic (PLEG): container finished" podID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerID="9eb13c9d0480acfdc7ab15c203347b34a63e0504efc9127264d926b2dd0b3a20" exitCode=0 Feb 27 00:55:13 crc kubenswrapper[4781]: I0227 00:55:13.089516 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerDied","Data":"9eb13c9d0480acfdc7ab15c203347b34a63e0504efc9127264d926b2dd0b3a20"} Feb 27 00:55:13 crc kubenswrapper[4781]: I0227 00:55:13.089549 4781 scope.go:117] "RemoveContainer" containerID="b43d442726c2fb32a9440da8ae9e4af3310909e5fb613884bf76597889808084" Feb 27 00:55:14 crc kubenswrapper[4781]: I0227 00:55:14.099604 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3"} Feb 27 00:56:00 crc kubenswrapper[4781]: I0227 00:56:00.155762 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535896-46l2w"] Feb 27 00:56:00 crc kubenswrapper[4781]: E0227 00:56:00.156903 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa" containerName="oc" Feb 27 00:56:00 crc kubenswrapper[4781]: I0227 00:56:00.156922 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa" containerName="oc" Feb 27 00:56:00 crc kubenswrapper[4781]: I0227 00:56:00.157132 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa" containerName="oc" Feb 27 00:56:00 
crc kubenswrapper[4781]: I0227 00:56:00.158094 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535896-46l2w" Feb 27 00:56:00 crc kubenswrapper[4781]: I0227 00:56:00.161641 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:56:00 crc kubenswrapper[4781]: I0227 00:56:00.161662 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:56:00 crc kubenswrapper[4781]: I0227 00:56:00.165047 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:56:00 crc kubenswrapper[4781]: I0227 00:56:00.166443 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535896-46l2w"] Feb 27 00:56:00 crc kubenswrapper[4781]: I0227 00:56:00.318056 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z7f8\" (UniqueName: \"kubernetes.io/projected/4c6b6160-b122-4248-b7ed-a206d3bc633e-kube-api-access-2z7f8\") pod \"auto-csr-approver-29535896-46l2w\" (UID: \"4c6b6160-b122-4248-b7ed-a206d3bc633e\") " pod="openshift-infra/auto-csr-approver-29535896-46l2w" Feb 27 00:56:00 crc kubenswrapper[4781]: I0227 00:56:00.420614 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z7f8\" (UniqueName: \"kubernetes.io/projected/4c6b6160-b122-4248-b7ed-a206d3bc633e-kube-api-access-2z7f8\") pod \"auto-csr-approver-29535896-46l2w\" (UID: \"4c6b6160-b122-4248-b7ed-a206d3bc633e\") " pod="openshift-infra/auto-csr-approver-29535896-46l2w" Feb 27 00:56:00 crc kubenswrapper[4781]: I0227 00:56:00.447703 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z7f8\" (UniqueName: \"kubernetes.io/projected/4c6b6160-b122-4248-b7ed-a206d3bc633e-kube-api-access-2z7f8\") 
pod \"auto-csr-approver-29535896-46l2w\" (UID: \"4c6b6160-b122-4248-b7ed-a206d3bc633e\") " pod="openshift-infra/auto-csr-approver-29535896-46l2w" Feb 27 00:56:00 crc kubenswrapper[4781]: I0227 00:56:00.478878 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535896-46l2w" Feb 27 00:56:00 crc kubenswrapper[4781]: I0227 00:56:00.923089 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535896-46l2w"] Feb 27 00:56:01 crc kubenswrapper[4781]: I0227 00:56:01.132649 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535896-46l2w" event={"ID":"4c6b6160-b122-4248-b7ed-a206d3bc633e","Type":"ContainerStarted","Data":"218f3c0a6c3ea1d377ad17763bf130dd08c0890823380392b00094a89bf1ab51"} Feb 27 00:56:03 crc kubenswrapper[4781]: I0227 00:56:03.151713 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535896-46l2w" event={"ID":"4c6b6160-b122-4248-b7ed-a206d3bc633e","Type":"ContainerStarted","Data":"e8e722eebfb284cc61eb30213644cd7eb1815f8a77725715668d5116c8a7d0d7"} Feb 27 00:56:03 crc kubenswrapper[4781]: I0227 00:56:03.169539 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535896-46l2w" podStartSLOduration=1.511518862 podStartE2EDuration="3.169519226s" podCreationTimestamp="2026-02-27 00:56:00 +0000 UTC" firstStartedPulling="2026-02-27 00:56:00.930056336 +0000 UTC m=+3030.187595890" lastFinishedPulling="2026-02-27 00:56:02.5880567 +0000 UTC m=+3031.845596254" observedRunningTime="2026-02-27 00:56:03.164279747 +0000 UTC m=+3032.421819321" watchObservedRunningTime="2026-02-27 00:56:03.169519226 +0000 UTC m=+3032.427058780" Feb 27 00:56:04 crc kubenswrapper[4781]: I0227 00:56:04.161443 4781 generic.go:334] "Generic (PLEG): container finished" podID="4c6b6160-b122-4248-b7ed-a206d3bc633e" 
containerID="e8e722eebfb284cc61eb30213644cd7eb1815f8a77725715668d5116c8a7d0d7" exitCode=0 Feb 27 00:56:04 crc kubenswrapper[4781]: I0227 00:56:04.161553 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535896-46l2w" event={"ID":"4c6b6160-b122-4248-b7ed-a206d3bc633e","Type":"ContainerDied","Data":"e8e722eebfb284cc61eb30213644cd7eb1815f8a77725715668d5116c8a7d0d7"} Feb 27 00:56:05 crc kubenswrapper[4781]: I0227 00:56:05.579260 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535896-46l2w" Feb 27 00:56:05 crc kubenswrapper[4781]: I0227 00:56:05.738019 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z7f8\" (UniqueName: \"kubernetes.io/projected/4c6b6160-b122-4248-b7ed-a206d3bc633e-kube-api-access-2z7f8\") pod \"4c6b6160-b122-4248-b7ed-a206d3bc633e\" (UID: \"4c6b6160-b122-4248-b7ed-a206d3bc633e\") " Feb 27 00:56:05 crc kubenswrapper[4781]: I0227 00:56:05.751831 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c6b6160-b122-4248-b7ed-a206d3bc633e-kube-api-access-2z7f8" (OuterVolumeSpecName: "kube-api-access-2z7f8") pod "4c6b6160-b122-4248-b7ed-a206d3bc633e" (UID: "4c6b6160-b122-4248-b7ed-a206d3bc633e"). InnerVolumeSpecName "kube-api-access-2z7f8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:56:05 crc kubenswrapper[4781]: I0227 00:56:05.841115 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2z7f8\" (UniqueName: \"kubernetes.io/projected/4c6b6160-b122-4248-b7ed-a206d3bc633e-kube-api-access-2z7f8\") on node \"crc\" DevicePath \"\"" Feb 27 00:56:06 crc kubenswrapper[4781]: I0227 00:56:06.184233 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535896-46l2w" event={"ID":"4c6b6160-b122-4248-b7ed-a206d3bc633e","Type":"ContainerDied","Data":"218f3c0a6c3ea1d377ad17763bf130dd08c0890823380392b00094a89bf1ab51"} Feb 27 00:56:06 crc kubenswrapper[4781]: I0227 00:56:06.184284 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="218f3c0a6c3ea1d377ad17763bf130dd08c0890823380392b00094a89bf1ab51" Feb 27 00:56:06 crc kubenswrapper[4781]: I0227 00:56:06.184291 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535896-46l2w" Feb 27 00:56:06 crc kubenswrapper[4781]: I0227 00:56:06.238755 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535890-nv49g"] Feb 27 00:56:06 crc kubenswrapper[4781]: I0227 00:56:06.247398 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535890-nv49g"] Feb 27 00:56:07 crc kubenswrapper[4781]: I0227 00:56:07.320998 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88fd3abb-2996-49d0-851b-41e0040438fa" path="/var/lib/kubelet/pods/88fd3abb-2996-49d0-851b-41e0040438fa/volumes" Feb 27 00:56:32 crc kubenswrapper[4781]: I0227 00:56:32.008442 4781 scope.go:117] "RemoveContainer" containerID="f2bb4ab5a55c1440d2f1c4f2cba63824f23a7c027f89afef51d965a575920c2b" Feb 27 00:57:42 crc kubenswrapper[4781]: I0227 00:57:42.900100 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:57:42 crc kubenswrapper[4781]: I0227 00:57:42.900752 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:58:00 crc kubenswrapper[4781]: I0227 00:58:00.161653 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535898-vpdkx"] Feb 27 00:58:00 crc kubenswrapper[4781]: E0227 00:58:00.162965 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c6b6160-b122-4248-b7ed-a206d3bc633e" containerName="oc" Feb 27 00:58:00 crc kubenswrapper[4781]: I0227 00:58:00.162983 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c6b6160-b122-4248-b7ed-a206d3bc633e" containerName="oc" Feb 27 00:58:00 crc kubenswrapper[4781]: I0227 00:58:00.163243 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c6b6160-b122-4248-b7ed-a206d3bc633e" containerName="oc" Feb 27 00:58:00 crc kubenswrapper[4781]: I0227 00:58:00.164202 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535898-vpdkx" Feb 27 00:58:00 crc kubenswrapper[4781]: I0227 00:58:00.167645 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 00:58:00 crc kubenswrapper[4781]: I0227 00:58:00.169821 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 00:58:00 crc kubenswrapper[4781]: I0227 00:58:00.170272 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 00:58:00 crc kubenswrapper[4781]: I0227 00:58:00.173425 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535898-vpdkx"] Feb 27 00:58:00 crc kubenswrapper[4781]: I0227 00:58:00.188156 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4548k\" (UniqueName: \"kubernetes.io/projected/b518ad5e-0994-4767-9c6d-d2ca11998a43-kube-api-access-4548k\") pod \"auto-csr-approver-29535898-vpdkx\" (UID: \"b518ad5e-0994-4767-9c6d-d2ca11998a43\") " pod="openshift-infra/auto-csr-approver-29535898-vpdkx" Feb 27 00:58:00 crc kubenswrapper[4781]: I0227 00:58:00.290291 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4548k\" (UniqueName: \"kubernetes.io/projected/b518ad5e-0994-4767-9c6d-d2ca11998a43-kube-api-access-4548k\") pod \"auto-csr-approver-29535898-vpdkx\" (UID: \"b518ad5e-0994-4767-9c6d-d2ca11998a43\") " pod="openshift-infra/auto-csr-approver-29535898-vpdkx" Feb 27 00:58:00 crc kubenswrapper[4781]: I0227 00:58:00.312284 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4548k\" (UniqueName: \"kubernetes.io/projected/b518ad5e-0994-4767-9c6d-d2ca11998a43-kube-api-access-4548k\") pod \"auto-csr-approver-29535898-vpdkx\" (UID: \"b518ad5e-0994-4767-9c6d-d2ca11998a43\") " 
pod="openshift-infra/auto-csr-approver-29535898-vpdkx" Feb 27 00:58:00 crc kubenswrapper[4781]: I0227 00:58:00.484362 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535898-vpdkx" Feb 27 00:58:00 crc kubenswrapper[4781]: I0227 00:58:00.941498 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535898-vpdkx"] Feb 27 00:58:01 crc kubenswrapper[4781]: I0227 00:58:01.257799 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535898-vpdkx" event={"ID":"b518ad5e-0994-4767-9c6d-d2ca11998a43","Type":"ContainerStarted","Data":"259afc3c6a121e274fc6ab9d86321b3db6f85807994694abde31715461c872fd"} Feb 27 00:58:02 crc kubenswrapper[4781]: I0227 00:58:02.270978 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535898-vpdkx" event={"ID":"b518ad5e-0994-4767-9c6d-d2ca11998a43","Type":"ContainerStarted","Data":"5f9790a75567a30dcdf46b8e6f6e9baff3953d885f3c6f58834afe7ab39768fd"} Feb 27 00:58:02 crc kubenswrapper[4781]: I0227 00:58:02.291241 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535898-vpdkx" podStartSLOduration=1.314665779 podStartE2EDuration="2.291217992s" podCreationTimestamp="2026-02-27 00:58:00 +0000 UTC" firstStartedPulling="2026-02-27 00:58:00.950655093 +0000 UTC m=+3150.208194647" lastFinishedPulling="2026-02-27 00:58:01.927207306 +0000 UTC m=+3151.184746860" observedRunningTime="2026-02-27 00:58:02.291202331 +0000 UTC m=+3151.548741885" watchObservedRunningTime="2026-02-27 00:58:02.291217992 +0000 UTC m=+3151.548757546" Feb 27 00:58:03 crc kubenswrapper[4781]: I0227 00:58:03.295526 4781 generic.go:334] "Generic (PLEG): container finished" podID="b518ad5e-0994-4767-9c6d-d2ca11998a43" containerID="5f9790a75567a30dcdf46b8e6f6e9baff3953d885f3c6f58834afe7ab39768fd" exitCode=0 Feb 27 00:58:03 crc 
kubenswrapper[4781]: I0227 00:58:03.295581 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535898-vpdkx" event={"ID":"b518ad5e-0994-4767-9c6d-d2ca11998a43","Type":"ContainerDied","Data":"5f9790a75567a30dcdf46b8e6f6e9baff3953d885f3c6f58834afe7ab39768fd"} Feb 27 00:58:04 crc kubenswrapper[4781]: I0227 00:58:04.742585 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535898-vpdkx" Feb 27 00:58:04 crc kubenswrapper[4781]: I0227 00:58:04.874534 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4548k\" (UniqueName: \"kubernetes.io/projected/b518ad5e-0994-4767-9c6d-d2ca11998a43-kube-api-access-4548k\") pod \"b518ad5e-0994-4767-9c6d-d2ca11998a43\" (UID: \"b518ad5e-0994-4767-9c6d-d2ca11998a43\") " Feb 27 00:58:04 crc kubenswrapper[4781]: I0227 00:58:04.881956 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b518ad5e-0994-4767-9c6d-d2ca11998a43-kube-api-access-4548k" (OuterVolumeSpecName: "kube-api-access-4548k") pod "b518ad5e-0994-4767-9c6d-d2ca11998a43" (UID: "b518ad5e-0994-4767-9c6d-d2ca11998a43"). InnerVolumeSpecName "kube-api-access-4548k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:58:04 crc kubenswrapper[4781]: I0227 00:58:04.977459 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4548k\" (UniqueName: \"kubernetes.io/projected/b518ad5e-0994-4767-9c6d-d2ca11998a43-kube-api-access-4548k\") on node \"crc\" DevicePath \"\"" Feb 27 00:58:05 crc kubenswrapper[4781]: I0227 00:58:05.316226 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535898-vpdkx" Feb 27 00:58:05 crc kubenswrapper[4781]: I0227 00:58:05.322075 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535898-vpdkx" event={"ID":"b518ad5e-0994-4767-9c6d-d2ca11998a43","Type":"ContainerDied","Data":"259afc3c6a121e274fc6ab9d86321b3db6f85807994694abde31715461c872fd"} Feb 27 00:58:05 crc kubenswrapper[4781]: I0227 00:58:05.322122 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="259afc3c6a121e274fc6ab9d86321b3db6f85807994694abde31715461c872fd" Feb 27 00:58:05 crc kubenswrapper[4781]: I0227 00:58:05.820268 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535892-5kr7b"] Feb 27 00:58:05 crc kubenswrapper[4781]: I0227 00:58:05.831091 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535892-5kr7b"] Feb 27 00:58:07 crc kubenswrapper[4781]: I0227 00:58:07.322924 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7eb81f10-0ea8-4376-b588-a3d9462c0bc4" path="/var/lib/kubelet/pods/7eb81f10-0ea8-4376-b588-a3d9462c0bc4/volumes" Feb 27 00:58:12 crc kubenswrapper[4781]: I0227 00:58:12.895155 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:58:12 crc kubenswrapper[4781]: I0227 00:58:12.895645 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:58:32 crc 
kubenswrapper[4781]: I0227 00:58:32.109205 4781 scope.go:117] "RemoveContainer" containerID="f073e5337bf518b81829559352fcea1859d4a5ced7771a4c11f45807c039ab0a" Feb 27 00:58:42 crc kubenswrapper[4781]: I0227 00:58:42.895537 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 00:58:42 crc kubenswrapper[4781]: I0227 00:58:42.896215 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 00:58:42 crc kubenswrapper[4781]: I0227 00:58:42.896273 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 00:58:42 crc kubenswrapper[4781]: I0227 00:58:42.897143 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3"} pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 00:58:42 crc kubenswrapper[4781]: I0227 00:58:42.897197 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" containerID="cri-o://af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" gracePeriod=600 Feb 27 00:58:43 crc kubenswrapper[4781]: E0227 
00:58:43.231419 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32c19e2e_0830_47a5_9ea8_862e1c9d8571.slice/crio-conmon-af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3.scope\": RecentStats: unable to find data in memory cache]" Feb 27 00:58:43 crc kubenswrapper[4781]: E0227 00:58:43.522507 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:58:43 crc kubenswrapper[4781]: I0227 00:58:43.687014 4781 generic.go:334] "Generic (PLEG): container finished" podID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" exitCode=0 Feb 27 00:58:43 crc kubenswrapper[4781]: I0227 00:58:43.687062 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerDied","Data":"af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3"} Feb 27 00:58:43 crc kubenswrapper[4781]: I0227 00:58:43.687098 4781 scope.go:117] "RemoveContainer" containerID="9eb13c9d0480acfdc7ab15c203347b34a63e0504efc9127264d926b2dd0b3a20" Feb 27 00:58:43 crc kubenswrapper[4781]: I0227 00:58:43.687804 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 00:58:43 crc kubenswrapper[4781]: E0227 00:58:43.688117 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:58:56 crc kubenswrapper[4781]: I0227 00:58:56.309955 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 00:58:56 crc kubenswrapper[4781]: E0227 00:58:56.310868 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:59:08 crc kubenswrapper[4781]: I0227 00:59:08.309468 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 00:59:08 crc kubenswrapper[4781]: E0227 00:59:08.310180 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:59:11 crc kubenswrapper[4781]: I0227 00:59:11.388832 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f2nl8"] Feb 27 00:59:11 crc kubenswrapper[4781]: E0227 00:59:11.391091 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b518ad5e-0994-4767-9c6d-d2ca11998a43" 
containerName="oc" Feb 27 00:59:11 crc kubenswrapper[4781]: I0227 00:59:11.391128 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b518ad5e-0994-4767-9c6d-d2ca11998a43" containerName="oc" Feb 27 00:59:11 crc kubenswrapper[4781]: I0227 00:59:11.391361 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="b518ad5e-0994-4767-9c6d-d2ca11998a43" containerName="oc" Feb 27 00:59:11 crc kubenswrapper[4781]: I0227 00:59:11.393311 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f2nl8" Feb 27 00:59:11 crc kubenswrapper[4781]: I0227 00:59:11.403578 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f2nl8"] Feb 27 00:59:11 crc kubenswrapper[4781]: I0227 00:59:11.573292 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxlwl\" (UniqueName: \"kubernetes.io/projected/debd5f2a-600a-4378-9ec7-133418b38ffe-kube-api-access-fxlwl\") pod \"redhat-marketplace-f2nl8\" (UID: \"debd5f2a-600a-4378-9ec7-133418b38ffe\") " pod="openshift-marketplace/redhat-marketplace-f2nl8" Feb 27 00:59:11 crc kubenswrapper[4781]: I0227 00:59:11.573411 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/debd5f2a-600a-4378-9ec7-133418b38ffe-utilities\") pod \"redhat-marketplace-f2nl8\" (UID: \"debd5f2a-600a-4378-9ec7-133418b38ffe\") " pod="openshift-marketplace/redhat-marketplace-f2nl8" Feb 27 00:59:11 crc kubenswrapper[4781]: I0227 00:59:11.573445 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/debd5f2a-600a-4378-9ec7-133418b38ffe-catalog-content\") pod \"redhat-marketplace-f2nl8\" (UID: \"debd5f2a-600a-4378-9ec7-133418b38ffe\") " pod="openshift-marketplace/redhat-marketplace-f2nl8" Feb 
27 00:59:11 crc kubenswrapper[4781]: I0227 00:59:11.675110 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxlwl\" (UniqueName: \"kubernetes.io/projected/debd5f2a-600a-4378-9ec7-133418b38ffe-kube-api-access-fxlwl\") pod \"redhat-marketplace-f2nl8\" (UID: \"debd5f2a-600a-4378-9ec7-133418b38ffe\") " pod="openshift-marketplace/redhat-marketplace-f2nl8" Feb 27 00:59:11 crc kubenswrapper[4781]: I0227 00:59:11.675213 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/debd5f2a-600a-4378-9ec7-133418b38ffe-utilities\") pod \"redhat-marketplace-f2nl8\" (UID: \"debd5f2a-600a-4378-9ec7-133418b38ffe\") " pod="openshift-marketplace/redhat-marketplace-f2nl8" Feb 27 00:59:11 crc kubenswrapper[4781]: I0227 00:59:11.675238 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/debd5f2a-600a-4378-9ec7-133418b38ffe-catalog-content\") pod \"redhat-marketplace-f2nl8\" (UID: \"debd5f2a-600a-4378-9ec7-133418b38ffe\") " pod="openshift-marketplace/redhat-marketplace-f2nl8" Feb 27 00:59:11 crc kubenswrapper[4781]: I0227 00:59:11.676079 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/debd5f2a-600a-4378-9ec7-133418b38ffe-catalog-content\") pod \"redhat-marketplace-f2nl8\" (UID: \"debd5f2a-600a-4378-9ec7-133418b38ffe\") " pod="openshift-marketplace/redhat-marketplace-f2nl8" Feb 27 00:59:11 crc kubenswrapper[4781]: I0227 00:59:11.676260 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/debd5f2a-600a-4378-9ec7-133418b38ffe-utilities\") pod \"redhat-marketplace-f2nl8\" (UID: \"debd5f2a-600a-4378-9ec7-133418b38ffe\") " pod="openshift-marketplace/redhat-marketplace-f2nl8" Feb 27 00:59:11 crc kubenswrapper[4781]: I0227 
00:59:11.702614 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxlwl\" (UniqueName: \"kubernetes.io/projected/debd5f2a-600a-4378-9ec7-133418b38ffe-kube-api-access-fxlwl\") pod \"redhat-marketplace-f2nl8\" (UID: \"debd5f2a-600a-4378-9ec7-133418b38ffe\") " pod="openshift-marketplace/redhat-marketplace-f2nl8" Feb 27 00:59:11 crc kubenswrapper[4781]: I0227 00:59:11.716345 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f2nl8" Feb 27 00:59:12 crc kubenswrapper[4781]: I0227 00:59:12.173151 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f2nl8"] Feb 27 00:59:12 crc kubenswrapper[4781]: I0227 00:59:12.958020 4781 generic.go:334] "Generic (PLEG): container finished" podID="debd5f2a-600a-4378-9ec7-133418b38ffe" containerID="ffccf986b660b401a5f6dbb81c6d3a3d7d1abe6a0be0ca0fb647a5bfd322cdb4" exitCode=0 Feb 27 00:59:12 crc kubenswrapper[4781]: I0227 00:59:12.958255 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2nl8" event={"ID":"debd5f2a-600a-4378-9ec7-133418b38ffe","Type":"ContainerDied","Data":"ffccf986b660b401a5f6dbb81c6d3a3d7d1abe6a0be0ca0fb647a5bfd322cdb4"} Feb 27 00:59:12 crc kubenswrapper[4781]: I0227 00:59:12.958282 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2nl8" event={"ID":"debd5f2a-600a-4378-9ec7-133418b38ffe","Type":"ContainerStarted","Data":"e8b6bf74d32142b7644d81c9992f7b8fe97a8df0b6f36a4ecbde8a8ead769a8e"} Feb 27 00:59:12 crc kubenswrapper[4781]: I0227 00:59:12.960453 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 00:59:13 crc kubenswrapper[4781]: I0227 00:59:13.968172 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2nl8" 
event={"ID":"debd5f2a-600a-4378-9ec7-133418b38ffe","Type":"ContainerStarted","Data":"d1d6ffe688294c6db8f240bfa49c3f89258f0257c99f53684605e922f3d8ed55"} Feb 27 00:59:14 crc kubenswrapper[4781]: I0227 00:59:14.979734 4781 generic.go:334] "Generic (PLEG): container finished" podID="debd5f2a-600a-4378-9ec7-133418b38ffe" containerID="d1d6ffe688294c6db8f240bfa49c3f89258f0257c99f53684605e922f3d8ed55" exitCode=0 Feb 27 00:59:14 crc kubenswrapper[4781]: I0227 00:59:14.979815 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2nl8" event={"ID":"debd5f2a-600a-4378-9ec7-133418b38ffe","Type":"ContainerDied","Data":"d1d6ffe688294c6db8f240bfa49c3f89258f0257c99f53684605e922f3d8ed55"} Feb 27 00:59:15 crc kubenswrapper[4781]: I0227 00:59:15.998845 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2nl8" event={"ID":"debd5f2a-600a-4378-9ec7-133418b38ffe","Type":"ContainerStarted","Data":"89bc8a1225ba837f99a6f028bb3f296fc1439e14885772a35fcbb879367a3485"} Feb 27 00:59:16 crc kubenswrapper[4781]: I0227 00:59:16.023459 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f2nl8" podStartSLOduration=2.522354554 podStartE2EDuration="5.023441483s" podCreationTimestamp="2026-02-27 00:59:11 +0000 UTC" firstStartedPulling="2026-02-27 00:59:12.960145797 +0000 UTC m=+3222.217685351" lastFinishedPulling="2026-02-27 00:59:15.461232726 +0000 UTC m=+3224.718772280" observedRunningTime="2026-02-27 00:59:16.016443308 +0000 UTC m=+3225.273982872" watchObservedRunningTime="2026-02-27 00:59:16.023441483 +0000 UTC m=+3225.280981037" Feb 27 00:59:21 crc kubenswrapper[4781]: I0227 00:59:21.321310 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 00:59:21 crc kubenswrapper[4781]: E0227 00:59:21.322117 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:59:21 crc kubenswrapper[4781]: I0227 00:59:21.716649 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f2nl8" Feb 27 00:59:21 crc kubenswrapper[4781]: I0227 00:59:21.717082 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f2nl8" Feb 27 00:59:21 crc kubenswrapper[4781]: I0227 00:59:21.773377 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f2nl8" Feb 27 00:59:22 crc kubenswrapper[4781]: I0227 00:59:22.109309 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f2nl8" Feb 27 00:59:22 crc kubenswrapper[4781]: I0227 00:59:22.156496 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f2nl8"] Feb 27 00:59:24 crc kubenswrapper[4781]: I0227 00:59:24.072848 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f2nl8" podUID="debd5f2a-600a-4378-9ec7-133418b38ffe" containerName="registry-server" containerID="cri-o://89bc8a1225ba837f99a6f028bb3f296fc1439e14885772a35fcbb879367a3485" gracePeriod=2 Feb 27 00:59:24 crc kubenswrapper[4781]: E0227 00:59:24.241602 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddebd5f2a_600a_4378_9ec7_133418b38ffe.slice/crio-conmon-89bc8a1225ba837f99a6f028bb3f296fc1439e14885772a35fcbb879367a3485.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddebd5f2a_600a_4378_9ec7_133418b38ffe.slice/crio-89bc8a1225ba837f99a6f028bb3f296fc1439e14885772a35fcbb879367a3485.scope\": RecentStats: unable to find data in memory cache]" Feb 27 00:59:24 crc kubenswrapper[4781]: I0227 00:59:24.766243 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f2nl8" Feb 27 00:59:24 crc kubenswrapper[4781]: I0227 00:59:24.860124 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxlwl\" (UniqueName: \"kubernetes.io/projected/debd5f2a-600a-4378-9ec7-133418b38ffe-kube-api-access-fxlwl\") pod \"debd5f2a-600a-4378-9ec7-133418b38ffe\" (UID: \"debd5f2a-600a-4378-9ec7-133418b38ffe\") " Feb 27 00:59:24 crc kubenswrapper[4781]: I0227 00:59:24.860278 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/debd5f2a-600a-4378-9ec7-133418b38ffe-utilities\") pod \"debd5f2a-600a-4378-9ec7-133418b38ffe\" (UID: \"debd5f2a-600a-4378-9ec7-133418b38ffe\") " Feb 27 00:59:24 crc kubenswrapper[4781]: I0227 00:59:24.860362 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/debd5f2a-600a-4378-9ec7-133418b38ffe-catalog-content\") pod \"debd5f2a-600a-4378-9ec7-133418b38ffe\" (UID: \"debd5f2a-600a-4378-9ec7-133418b38ffe\") " Feb 27 00:59:24 crc kubenswrapper[4781]: I0227 00:59:24.861211 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/debd5f2a-600a-4378-9ec7-133418b38ffe-utilities" (OuterVolumeSpecName: "utilities") 
pod "debd5f2a-600a-4378-9ec7-133418b38ffe" (UID: "debd5f2a-600a-4378-9ec7-133418b38ffe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:59:24 crc kubenswrapper[4781]: I0227 00:59:24.861694 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/debd5f2a-600a-4378-9ec7-133418b38ffe-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 00:59:24 crc kubenswrapper[4781]: I0227 00:59:24.876107 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/debd5f2a-600a-4378-9ec7-133418b38ffe-kube-api-access-fxlwl" (OuterVolumeSpecName: "kube-api-access-fxlwl") pod "debd5f2a-600a-4378-9ec7-133418b38ffe" (UID: "debd5f2a-600a-4378-9ec7-133418b38ffe"). InnerVolumeSpecName "kube-api-access-fxlwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 00:59:24 crc kubenswrapper[4781]: I0227 00:59:24.899675 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/debd5f2a-600a-4378-9ec7-133418b38ffe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "debd5f2a-600a-4378-9ec7-133418b38ffe" (UID: "debd5f2a-600a-4378-9ec7-133418b38ffe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 00:59:24 crc kubenswrapper[4781]: I0227 00:59:24.963843 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxlwl\" (UniqueName: \"kubernetes.io/projected/debd5f2a-600a-4378-9ec7-133418b38ffe-kube-api-access-fxlwl\") on node \"crc\" DevicePath \"\"" Feb 27 00:59:24 crc kubenswrapper[4781]: I0227 00:59:24.964220 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/debd5f2a-600a-4378-9ec7-133418b38ffe-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 00:59:25 crc kubenswrapper[4781]: I0227 00:59:25.083299 4781 generic.go:334] "Generic (PLEG): container finished" podID="debd5f2a-600a-4378-9ec7-133418b38ffe" containerID="89bc8a1225ba837f99a6f028bb3f296fc1439e14885772a35fcbb879367a3485" exitCode=0 Feb 27 00:59:25 crc kubenswrapper[4781]: I0227 00:59:25.083339 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2nl8" event={"ID":"debd5f2a-600a-4378-9ec7-133418b38ffe","Type":"ContainerDied","Data":"89bc8a1225ba837f99a6f028bb3f296fc1439e14885772a35fcbb879367a3485"} Feb 27 00:59:25 crc kubenswrapper[4781]: I0227 00:59:25.083352 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f2nl8" Feb 27 00:59:25 crc kubenswrapper[4781]: I0227 00:59:25.083381 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2nl8" event={"ID":"debd5f2a-600a-4378-9ec7-133418b38ffe","Type":"ContainerDied","Data":"e8b6bf74d32142b7644d81c9992f7b8fe97a8df0b6f36a4ecbde8a8ead769a8e"} Feb 27 00:59:25 crc kubenswrapper[4781]: I0227 00:59:25.083399 4781 scope.go:117] "RemoveContainer" containerID="89bc8a1225ba837f99a6f028bb3f296fc1439e14885772a35fcbb879367a3485" Feb 27 00:59:25 crc kubenswrapper[4781]: I0227 00:59:25.110054 4781 scope.go:117] "RemoveContainer" containerID="d1d6ffe688294c6db8f240bfa49c3f89258f0257c99f53684605e922f3d8ed55" Feb 27 00:59:25 crc kubenswrapper[4781]: I0227 00:59:25.123745 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f2nl8"] Feb 27 00:59:25 crc kubenswrapper[4781]: I0227 00:59:25.136433 4781 scope.go:117] "RemoveContainer" containerID="ffccf986b660b401a5f6dbb81c6d3a3d7d1abe6a0be0ca0fb647a5bfd322cdb4" Feb 27 00:59:25 crc kubenswrapper[4781]: I0227 00:59:25.136665 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f2nl8"] Feb 27 00:59:25 crc kubenswrapper[4781]: I0227 00:59:25.192421 4781 scope.go:117] "RemoveContainer" containerID="89bc8a1225ba837f99a6f028bb3f296fc1439e14885772a35fcbb879367a3485" Feb 27 00:59:25 crc kubenswrapper[4781]: E0227 00:59:25.192907 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89bc8a1225ba837f99a6f028bb3f296fc1439e14885772a35fcbb879367a3485\": container with ID starting with 89bc8a1225ba837f99a6f028bb3f296fc1439e14885772a35fcbb879367a3485 not found: ID does not exist" containerID="89bc8a1225ba837f99a6f028bb3f296fc1439e14885772a35fcbb879367a3485" Feb 27 00:59:25 crc kubenswrapper[4781]: I0227 00:59:25.192976 4781 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89bc8a1225ba837f99a6f028bb3f296fc1439e14885772a35fcbb879367a3485"} err="failed to get container status \"89bc8a1225ba837f99a6f028bb3f296fc1439e14885772a35fcbb879367a3485\": rpc error: code = NotFound desc = could not find container \"89bc8a1225ba837f99a6f028bb3f296fc1439e14885772a35fcbb879367a3485\": container with ID starting with 89bc8a1225ba837f99a6f028bb3f296fc1439e14885772a35fcbb879367a3485 not found: ID does not exist" Feb 27 00:59:25 crc kubenswrapper[4781]: I0227 00:59:25.193004 4781 scope.go:117] "RemoveContainer" containerID="d1d6ffe688294c6db8f240bfa49c3f89258f0257c99f53684605e922f3d8ed55" Feb 27 00:59:25 crc kubenswrapper[4781]: E0227 00:59:25.194383 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1d6ffe688294c6db8f240bfa49c3f89258f0257c99f53684605e922f3d8ed55\": container with ID starting with d1d6ffe688294c6db8f240bfa49c3f89258f0257c99f53684605e922f3d8ed55 not found: ID does not exist" containerID="d1d6ffe688294c6db8f240bfa49c3f89258f0257c99f53684605e922f3d8ed55" Feb 27 00:59:25 crc kubenswrapper[4781]: I0227 00:59:25.194441 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1d6ffe688294c6db8f240bfa49c3f89258f0257c99f53684605e922f3d8ed55"} err="failed to get container status \"d1d6ffe688294c6db8f240bfa49c3f89258f0257c99f53684605e922f3d8ed55\": rpc error: code = NotFound desc = could not find container \"d1d6ffe688294c6db8f240bfa49c3f89258f0257c99f53684605e922f3d8ed55\": container with ID starting with d1d6ffe688294c6db8f240bfa49c3f89258f0257c99f53684605e922f3d8ed55 not found: ID does not exist" Feb 27 00:59:25 crc kubenswrapper[4781]: I0227 00:59:25.194477 4781 scope.go:117] "RemoveContainer" containerID="ffccf986b660b401a5f6dbb81c6d3a3d7d1abe6a0be0ca0fb647a5bfd322cdb4" Feb 27 00:59:25 crc kubenswrapper[4781]: E0227 
00:59:25.195048 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffccf986b660b401a5f6dbb81c6d3a3d7d1abe6a0be0ca0fb647a5bfd322cdb4\": container with ID starting with ffccf986b660b401a5f6dbb81c6d3a3d7d1abe6a0be0ca0fb647a5bfd322cdb4 not found: ID does not exist" containerID="ffccf986b660b401a5f6dbb81c6d3a3d7d1abe6a0be0ca0fb647a5bfd322cdb4" Feb 27 00:59:25 crc kubenswrapper[4781]: I0227 00:59:25.195734 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffccf986b660b401a5f6dbb81c6d3a3d7d1abe6a0be0ca0fb647a5bfd322cdb4"} err="failed to get container status \"ffccf986b660b401a5f6dbb81c6d3a3d7d1abe6a0be0ca0fb647a5bfd322cdb4\": rpc error: code = NotFound desc = could not find container \"ffccf986b660b401a5f6dbb81c6d3a3d7d1abe6a0be0ca0fb647a5bfd322cdb4\": container with ID starting with ffccf986b660b401a5f6dbb81c6d3a3d7d1abe6a0be0ca0fb647a5bfd322cdb4 not found: ID does not exist" Feb 27 00:59:25 crc kubenswrapper[4781]: I0227 00:59:25.321368 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="debd5f2a-600a-4378-9ec7-133418b38ffe" path="/var/lib/kubelet/pods/debd5f2a-600a-4378-9ec7-133418b38ffe/volumes" Feb 27 00:59:35 crc kubenswrapper[4781]: I0227 00:59:35.311307 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 00:59:35 crc kubenswrapper[4781]: E0227 00:59:35.312071 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 00:59:50 crc kubenswrapper[4781]: I0227 00:59:50.309297 
4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 00:59:50 crc kubenswrapper[4781]: E0227 00:59:50.310080 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.182036 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535900-6n6zr"] Feb 27 01:00:00 crc kubenswrapper[4781]: E0227 01:00:00.183155 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="debd5f2a-600a-4378-9ec7-133418b38ffe" containerName="registry-server" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.183172 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="debd5f2a-600a-4378-9ec7-133418b38ffe" containerName="registry-server" Feb 27 01:00:00 crc kubenswrapper[4781]: E0227 01:00:00.183193 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="debd5f2a-600a-4378-9ec7-133418b38ffe" containerName="extract-content" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.183203 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="debd5f2a-600a-4378-9ec7-133418b38ffe" containerName="extract-content" Feb 27 01:00:00 crc kubenswrapper[4781]: E0227 01:00:00.183226 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="debd5f2a-600a-4378-9ec7-133418b38ffe" containerName="extract-utilities" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.183235 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="debd5f2a-600a-4378-9ec7-133418b38ffe" containerName="extract-utilities" Feb 27 01:00:00 crc 
kubenswrapper[4781]: I0227 01:00:00.183535 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="debd5f2a-600a-4378-9ec7-133418b38ffe" containerName="registry-server" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.184421 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535900-6n6zr" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.187729 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.188260 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.189935 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.195413 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc"] Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.197307 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.200776 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.203154 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.210755 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535900-6n6zr"] Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.219417 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc"] Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.297918 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4be24050-0334-412d-9c01-525815caef28-secret-volume\") pod \"collect-profiles-29535900-d9mbc\" (UID: \"4be24050-0334-412d-9c01-525815caef28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.298028 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4be24050-0334-412d-9c01-525815caef28-config-volume\") pod \"collect-profiles-29535900-d9mbc\" (UID: \"4be24050-0334-412d-9c01-525815caef28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.298138 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f99cs\" (UniqueName: 
\"kubernetes.io/projected/4be24050-0334-412d-9c01-525815caef28-kube-api-access-f99cs\") pod \"collect-profiles-29535900-d9mbc\" (UID: \"4be24050-0334-412d-9c01-525815caef28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.298277 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srjss\" (UniqueName: \"kubernetes.io/projected/0aac78d6-5f5c-4b48-95f2-554353abcdd3-kube-api-access-srjss\") pod \"auto-csr-approver-29535900-6n6zr\" (UID: \"0aac78d6-5f5c-4b48-95f2-554353abcdd3\") " pod="openshift-infra/auto-csr-approver-29535900-6n6zr" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.399692 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srjss\" (UniqueName: \"kubernetes.io/projected/0aac78d6-5f5c-4b48-95f2-554353abcdd3-kube-api-access-srjss\") pod \"auto-csr-approver-29535900-6n6zr\" (UID: \"0aac78d6-5f5c-4b48-95f2-554353abcdd3\") " pod="openshift-infra/auto-csr-approver-29535900-6n6zr" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.399790 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4be24050-0334-412d-9c01-525815caef28-secret-volume\") pod \"collect-profiles-29535900-d9mbc\" (UID: \"4be24050-0334-412d-9c01-525815caef28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.399827 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4be24050-0334-412d-9c01-525815caef28-config-volume\") pod \"collect-profiles-29535900-d9mbc\" (UID: \"4be24050-0334-412d-9c01-525815caef28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 
01:00:00.399902 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f99cs\" (UniqueName: \"kubernetes.io/projected/4be24050-0334-412d-9c01-525815caef28-kube-api-access-f99cs\") pod \"collect-profiles-29535900-d9mbc\" (UID: \"4be24050-0334-412d-9c01-525815caef28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.400913 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4be24050-0334-412d-9c01-525815caef28-config-volume\") pod \"collect-profiles-29535900-d9mbc\" (UID: \"4be24050-0334-412d-9c01-525815caef28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.406676 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4be24050-0334-412d-9c01-525815caef28-secret-volume\") pod \"collect-profiles-29535900-d9mbc\" (UID: \"4be24050-0334-412d-9c01-525815caef28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.419812 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srjss\" (UniqueName: \"kubernetes.io/projected/0aac78d6-5f5c-4b48-95f2-554353abcdd3-kube-api-access-srjss\") pod \"auto-csr-approver-29535900-6n6zr\" (UID: \"0aac78d6-5f5c-4b48-95f2-554353abcdd3\") " pod="openshift-infra/auto-csr-approver-29535900-6n6zr" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.420493 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f99cs\" (UniqueName: \"kubernetes.io/projected/4be24050-0334-412d-9c01-525815caef28-kube-api-access-f99cs\") pod \"collect-profiles-29535900-d9mbc\" (UID: \"4be24050-0334-412d-9c01-525815caef28\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.511308 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535900-6n6zr" Feb 27 01:00:00 crc kubenswrapper[4781]: I0227 01:00:00.526563 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc" Feb 27 01:00:01 crc kubenswrapper[4781]: I0227 01:00:01.000735 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc"] Feb 27 01:00:01 crc kubenswrapper[4781]: I0227 01:00:01.105433 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535900-6n6zr"] Feb 27 01:00:01 crc kubenswrapper[4781]: W0227 01:00:01.124915 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0aac78d6_5f5c_4b48_95f2_554353abcdd3.slice/crio-f7d1c70db713d929fdac86351a40f8e0c653d199ec79319ad594457a51783afe WatchSource:0}: Error finding container f7d1c70db713d929fdac86351a40f8e0c653d199ec79319ad594457a51783afe: Status 404 returned error can't find the container with id f7d1c70db713d929fdac86351a40f8e0c653d199ec79319ad594457a51783afe Feb 27 01:00:01 crc kubenswrapper[4781]: I0227 01:00:01.425803 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535900-6n6zr" event={"ID":"0aac78d6-5f5c-4b48-95f2-554353abcdd3","Type":"ContainerStarted","Data":"f7d1c70db713d929fdac86351a40f8e0c653d199ec79319ad594457a51783afe"} Feb 27 01:00:01 crc kubenswrapper[4781]: I0227 01:00:01.427380 4781 generic.go:334] "Generic (PLEG): container finished" podID="4be24050-0334-412d-9c01-525815caef28" containerID="88af5d2f7a10fab35719beb6fe30d40bbd5167a8ef25825d9123b0d5e4f7e563" exitCode=0 Feb 27 01:00:01 crc 
kubenswrapper[4781]: I0227 01:00:01.427442 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc" event={"ID":"4be24050-0334-412d-9c01-525815caef28","Type":"ContainerDied","Data":"88af5d2f7a10fab35719beb6fe30d40bbd5167a8ef25825d9123b0d5e4f7e563"} Feb 27 01:00:01 crc kubenswrapper[4781]: I0227 01:00:01.427473 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc" event={"ID":"4be24050-0334-412d-9c01-525815caef28","Type":"ContainerStarted","Data":"9aada356d2480234caecb610cda7a6bb9e6ec4d19386ef97e88fd09cb849c15f"} Feb 27 01:00:02 crc kubenswrapper[4781]: I0227 01:00:02.310425 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 01:00:02 crc kubenswrapper[4781]: E0227 01:00:02.311313 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:00:02 crc kubenswrapper[4781]: I0227 01:00:02.875861 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc" Feb 27 01:00:02 crc kubenswrapper[4781]: I0227 01:00:02.961751 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4be24050-0334-412d-9c01-525815caef28-config-volume\") pod \"4be24050-0334-412d-9c01-525815caef28\" (UID: \"4be24050-0334-412d-9c01-525815caef28\") " Feb 27 01:00:02 crc kubenswrapper[4781]: I0227 01:00:02.961929 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4be24050-0334-412d-9c01-525815caef28-secret-volume\") pod \"4be24050-0334-412d-9c01-525815caef28\" (UID: \"4be24050-0334-412d-9c01-525815caef28\") " Feb 27 01:00:02 crc kubenswrapper[4781]: I0227 01:00:02.961969 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f99cs\" (UniqueName: \"kubernetes.io/projected/4be24050-0334-412d-9c01-525815caef28-kube-api-access-f99cs\") pod \"4be24050-0334-412d-9c01-525815caef28\" (UID: \"4be24050-0334-412d-9c01-525815caef28\") " Feb 27 01:00:02 crc kubenswrapper[4781]: I0227 01:00:02.963647 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4be24050-0334-412d-9c01-525815caef28-config-volume" (OuterVolumeSpecName: "config-volume") pod "4be24050-0334-412d-9c01-525815caef28" (UID: "4be24050-0334-412d-9c01-525815caef28"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:00:02 crc kubenswrapper[4781]: I0227 01:00:02.969523 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4be24050-0334-412d-9c01-525815caef28-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4be24050-0334-412d-9c01-525815caef28" (UID: "4be24050-0334-412d-9c01-525815caef28"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:00:02 crc kubenswrapper[4781]: I0227 01:00:02.974407 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4be24050-0334-412d-9c01-525815caef28-kube-api-access-f99cs" (OuterVolumeSpecName: "kube-api-access-f99cs") pod "4be24050-0334-412d-9c01-525815caef28" (UID: "4be24050-0334-412d-9c01-525815caef28"). InnerVolumeSpecName "kube-api-access-f99cs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:00:03 crc kubenswrapper[4781]: I0227 01:00:03.064266 4781 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4be24050-0334-412d-9c01-525815caef28-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 27 01:00:03 crc kubenswrapper[4781]: I0227 01:00:03.064364 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f99cs\" (UniqueName: \"kubernetes.io/projected/4be24050-0334-412d-9c01-525815caef28-kube-api-access-f99cs\") on node \"crc\" DevicePath \"\"" Feb 27 01:00:03 crc kubenswrapper[4781]: I0227 01:00:03.064374 4781 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4be24050-0334-412d-9c01-525815caef28-config-volume\") on node \"crc\" DevicePath \"\"" Feb 27 01:00:03 crc kubenswrapper[4781]: I0227 01:00:03.451169 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc" event={"ID":"4be24050-0334-412d-9c01-525815caef28","Type":"ContainerDied","Data":"9aada356d2480234caecb610cda7a6bb9e6ec4d19386ef97e88fd09cb849c15f"} Feb 27 01:00:03 crc kubenswrapper[4781]: I0227 01:00:03.451218 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9aada356d2480234caecb610cda7a6bb9e6ec4d19386ef97e88fd09cb849c15f" Feb 27 01:00:03 crc kubenswrapper[4781]: I0227 01:00:03.451278 4781 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-d9mbc" Feb 27 01:00:03 crc kubenswrapper[4781]: I0227 01:00:03.945936 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h"] Feb 27 01:00:03 crc kubenswrapper[4781]: I0227 01:00:03.955202 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535855-hf24h"] Feb 27 01:00:05 crc kubenswrapper[4781]: I0227 01:00:05.357101 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22b44418-6039-4859-96ba-1442e52b290e" path="/var/lib/kubelet/pods/22b44418-6039-4859-96ba-1442e52b290e/volumes" Feb 27 01:00:05 crc kubenswrapper[4781]: I0227 01:00:05.473421 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535900-6n6zr" event={"ID":"0aac78d6-5f5c-4b48-95f2-554353abcdd3","Type":"ContainerStarted","Data":"fb76bcf8730e0171831c959b0a00779c7b469f5264f4c1f6152625c6f8db5a04"} Feb 27 01:00:05 crc kubenswrapper[4781]: I0227 01:00:05.497506 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535900-6n6zr" podStartSLOduration=1.8308608990000002 podStartE2EDuration="5.497483978s" podCreationTimestamp="2026-02-27 01:00:00 +0000 UTC" firstStartedPulling="2026-02-27 01:00:01.12980955 +0000 UTC m=+3270.387349114" lastFinishedPulling="2026-02-27 01:00:04.796432639 +0000 UTC m=+3274.053972193" observedRunningTime="2026-02-27 01:00:05.486710333 +0000 UTC m=+3274.744249907" watchObservedRunningTime="2026-02-27 01:00:05.497483978 +0000 UTC m=+3274.755023542" Feb 27 01:00:06 crc kubenswrapper[4781]: I0227 01:00:06.484237 4781 generic.go:334] "Generic (PLEG): container finished" podID="0aac78d6-5f5c-4b48-95f2-554353abcdd3" containerID="fb76bcf8730e0171831c959b0a00779c7b469f5264f4c1f6152625c6f8db5a04" exitCode=0 Feb 27 01:00:06 crc 
kubenswrapper[4781]: I0227 01:00:06.484296 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535900-6n6zr" event={"ID":"0aac78d6-5f5c-4b48-95f2-554353abcdd3","Type":"ContainerDied","Data":"fb76bcf8730e0171831c959b0a00779c7b469f5264f4c1f6152625c6f8db5a04"} Feb 27 01:00:07 crc kubenswrapper[4781]: I0227 01:00:07.929230 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535900-6n6zr" Feb 27 01:00:07 crc kubenswrapper[4781]: I0227 01:00:07.964048 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srjss\" (UniqueName: \"kubernetes.io/projected/0aac78d6-5f5c-4b48-95f2-554353abcdd3-kube-api-access-srjss\") pod \"0aac78d6-5f5c-4b48-95f2-554353abcdd3\" (UID: \"0aac78d6-5f5c-4b48-95f2-554353abcdd3\") " Feb 27 01:00:07 crc kubenswrapper[4781]: I0227 01:00:07.970108 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aac78d6-5f5c-4b48-95f2-554353abcdd3-kube-api-access-srjss" (OuterVolumeSpecName: "kube-api-access-srjss") pod "0aac78d6-5f5c-4b48-95f2-554353abcdd3" (UID: "0aac78d6-5f5c-4b48-95f2-554353abcdd3"). InnerVolumeSpecName "kube-api-access-srjss". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:00:08 crc kubenswrapper[4781]: I0227 01:00:08.066974 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srjss\" (UniqueName: \"kubernetes.io/projected/0aac78d6-5f5c-4b48-95f2-554353abcdd3-kube-api-access-srjss\") on node \"crc\" DevicePath \"\"" Feb 27 01:00:08 crc kubenswrapper[4781]: I0227 01:00:08.505903 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535900-6n6zr" event={"ID":"0aac78d6-5f5c-4b48-95f2-554353abcdd3","Type":"ContainerDied","Data":"f7d1c70db713d929fdac86351a40f8e0c653d199ec79319ad594457a51783afe"} Feb 27 01:00:08 crc kubenswrapper[4781]: I0227 01:00:08.506248 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7d1c70db713d929fdac86351a40f8e0c653d199ec79319ad594457a51783afe" Feb 27 01:00:08 crc kubenswrapper[4781]: I0227 01:00:08.505944 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535900-6n6zr" Feb 27 01:00:08 crc kubenswrapper[4781]: I0227 01:00:08.557799 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535894-wdzjg"] Feb 27 01:00:08 crc kubenswrapper[4781]: I0227 01:00:08.571336 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535894-wdzjg"] Feb 27 01:00:09 crc kubenswrapper[4781]: I0227 01:00:09.321371 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa" path="/var/lib/kubelet/pods/1b3eaaa4-e7c4-4d5d-911b-cde5d563fcfa/volumes" Feb 27 01:00:17 crc kubenswrapper[4781]: I0227 01:00:17.310100 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 01:00:17 crc kubenswrapper[4781]: E0227 01:00:17.310741 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:00:30 crc kubenswrapper[4781]: I0227 01:00:30.310538 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 01:00:30 crc kubenswrapper[4781]: E0227 01:00:30.311392 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:00:32 crc kubenswrapper[4781]: I0227 01:00:32.233545 4781 scope.go:117] "RemoveContainer" containerID="1f1ef56dac2e7ed3023bb30987d569aec06c9a96b99c1e9e939085397f33ecaf" Feb 27 01:00:32 crc kubenswrapper[4781]: I0227 01:00:32.274224 4781 scope.go:117] "RemoveContainer" containerID="cd5eb21f935374fa744e81c6189b26e6ff6841a0ef882762f86735b2bdaec5ee" Feb 27 01:00:40 crc kubenswrapper[4781]: I0227 01:00:40.868458 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cwjz6"] Feb 27 01:00:40 crc kubenswrapper[4781]: E0227 01:00:40.870422 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aac78d6-5f5c-4b48-95f2-554353abcdd3" containerName="oc" Feb 27 01:00:40 crc kubenswrapper[4781]: I0227 01:00:40.870443 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aac78d6-5f5c-4b48-95f2-554353abcdd3" containerName="oc" Feb 27 01:00:40 crc kubenswrapper[4781]: E0227 01:00:40.870469 4781 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4be24050-0334-412d-9c01-525815caef28" containerName="collect-profiles" Feb 27 01:00:40 crc kubenswrapper[4781]: I0227 01:00:40.870478 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="4be24050-0334-412d-9c01-525815caef28" containerName="collect-profiles" Feb 27 01:00:40 crc kubenswrapper[4781]: I0227 01:00:40.882429 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="4be24050-0334-412d-9c01-525815caef28" containerName="collect-profiles" Feb 27 01:00:40 crc kubenswrapper[4781]: I0227 01:00:40.882478 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aac78d6-5f5c-4b48-95f2-554353abcdd3" containerName="oc" Feb 27 01:00:40 crc kubenswrapper[4781]: I0227 01:00:40.886903 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cwjz6" Feb 27 01:00:40 crc kubenswrapper[4781]: I0227 01:00:40.891242 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cwjz6"] Feb 27 01:00:40 crc kubenswrapper[4781]: I0227 01:00:40.946001 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gkj9\" (UniqueName: \"kubernetes.io/projected/8645ad19-e972-4563-8c61-0b409e68654f-kube-api-access-2gkj9\") pod \"redhat-operators-cwjz6\" (UID: \"8645ad19-e972-4563-8c61-0b409e68654f\") " pod="openshift-marketplace/redhat-operators-cwjz6" Feb 27 01:00:40 crc kubenswrapper[4781]: I0227 01:00:40.946156 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8645ad19-e972-4563-8c61-0b409e68654f-catalog-content\") pod \"redhat-operators-cwjz6\" (UID: \"8645ad19-e972-4563-8c61-0b409e68654f\") " pod="openshift-marketplace/redhat-operators-cwjz6" Feb 27 01:00:40 crc kubenswrapper[4781]: I0227 01:00:40.946182 4781 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8645ad19-e972-4563-8c61-0b409e68654f-utilities\") pod \"redhat-operators-cwjz6\" (UID: \"8645ad19-e972-4563-8c61-0b409e68654f\") " pod="openshift-marketplace/redhat-operators-cwjz6" Feb 27 01:00:41 crc kubenswrapper[4781]: I0227 01:00:41.048009 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8645ad19-e972-4563-8c61-0b409e68654f-catalog-content\") pod \"redhat-operators-cwjz6\" (UID: \"8645ad19-e972-4563-8c61-0b409e68654f\") " pod="openshift-marketplace/redhat-operators-cwjz6" Feb 27 01:00:41 crc kubenswrapper[4781]: I0227 01:00:41.048376 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8645ad19-e972-4563-8c61-0b409e68654f-utilities\") pod \"redhat-operators-cwjz6\" (UID: \"8645ad19-e972-4563-8c61-0b409e68654f\") " pod="openshift-marketplace/redhat-operators-cwjz6" Feb 27 01:00:41 crc kubenswrapper[4781]: I0227 01:00:41.048572 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gkj9\" (UniqueName: \"kubernetes.io/projected/8645ad19-e972-4563-8c61-0b409e68654f-kube-api-access-2gkj9\") pod \"redhat-operators-cwjz6\" (UID: \"8645ad19-e972-4563-8c61-0b409e68654f\") " pod="openshift-marketplace/redhat-operators-cwjz6" Feb 27 01:00:41 crc kubenswrapper[4781]: I0227 01:00:41.048980 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8645ad19-e972-4563-8c61-0b409e68654f-catalog-content\") pod \"redhat-operators-cwjz6\" (UID: \"8645ad19-e972-4563-8c61-0b409e68654f\") " pod="openshift-marketplace/redhat-operators-cwjz6" Feb 27 01:00:41 crc kubenswrapper[4781]: I0227 01:00:41.048980 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8645ad19-e972-4563-8c61-0b409e68654f-utilities\") pod \"redhat-operators-cwjz6\" (UID: \"8645ad19-e972-4563-8c61-0b409e68654f\") " pod="openshift-marketplace/redhat-operators-cwjz6" Feb 27 01:00:41 crc kubenswrapper[4781]: I0227 01:00:41.077653 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gkj9\" (UniqueName: \"kubernetes.io/projected/8645ad19-e972-4563-8c61-0b409e68654f-kube-api-access-2gkj9\") pod \"redhat-operators-cwjz6\" (UID: \"8645ad19-e972-4563-8c61-0b409e68654f\") " pod="openshift-marketplace/redhat-operators-cwjz6" Feb 27 01:00:41 crc kubenswrapper[4781]: I0227 01:00:41.221836 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cwjz6" Feb 27 01:00:41 crc kubenswrapper[4781]: I0227 01:00:41.757978 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cwjz6"] Feb 27 01:00:41 crc kubenswrapper[4781]: I0227 01:00:41.825352 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cwjz6" event={"ID":"8645ad19-e972-4563-8c61-0b409e68654f","Type":"ContainerStarted","Data":"ce0753b848865fc62648932e9ede65f00c3e009f79642c333fed3ad246907b56"} Feb 27 01:00:42 crc kubenswrapper[4781]: I0227 01:00:42.835798 4781 generic.go:334] "Generic (PLEG): container finished" podID="8645ad19-e972-4563-8c61-0b409e68654f" containerID="f8b0aa733e6926d4ead2ba5159b9d6863acb414faa55e255836a8eda034492d4" exitCode=0 Feb 27 01:00:42 crc kubenswrapper[4781]: I0227 01:00:42.835838 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cwjz6" event={"ID":"8645ad19-e972-4563-8c61-0b409e68654f","Type":"ContainerDied","Data":"f8b0aa733e6926d4ead2ba5159b9d6863acb414faa55e255836a8eda034492d4"} Feb 27 01:00:44 crc kubenswrapper[4781]: I0227 01:00:44.309172 4781 
scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 01:00:44 crc kubenswrapper[4781]: E0227 01:00:44.309892 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:00:59 crc kubenswrapper[4781]: I0227 01:00:59.309776 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 01:00:59 crc kubenswrapper[4781]: E0227 01:00:59.310581 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:01:00 crc kubenswrapper[4781]: I0227 01:01:00.152699 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29535901-2chr7"] Feb 27 01:01:00 crc kubenswrapper[4781]: I0227 01:01:00.154116 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29535901-2chr7" Feb 27 01:01:00 crc kubenswrapper[4781]: I0227 01:01:00.170107 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29535901-2chr7"] Feb 27 01:01:00 crc kubenswrapper[4781]: I0227 01:01:00.260702 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lts4\" (UniqueName: \"kubernetes.io/projected/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-kube-api-access-2lts4\") pod \"keystone-cron-29535901-2chr7\" (UID: \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\") " pod="openstack/keystone-cron-29535901-2chr7" Feb 27 01:01:00 crc kubenswrapper[4781]: I0227 01:01:00.260993 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-combined-ca-bundle\") pod \"keystone-cron-29535901-2chr7\" (UID: \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\") " pod="openstack/keystone-cron-29535901-2chr7" Feb 27 01:01:00 crc kubenswrapper[4781]: I0227 01:01:00.261093 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-config-data\") pod \"keystone-cron-29535901-2chr7\" (UID: \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\") " pod="openstack/keystone-cron-29535901-2chr7" Feb 27 01:01:00 crc kubenswrapper[4781]: I0227 01:01:00.261157 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-fernet-keys\") pod \"keystone-cron-29535901-2chr7\" (UID: \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\") " pod="openstack/keystone-cron-29535901-2chr7" Feb 27 01:01:00 crc kubenswrapper[4781]: I0227 01:01:00.363750 4781 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-2lts4\" (UniqueName: \"kubernetes.io/projected/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-kube-api-access-2lts4\") pod \"keystone-cron-29535901-2chr7\" (UID: \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\") " pod="openstack/keystone-cron-29535901-2chr7" Feb 27 01:01:00 crc kubenswrapper[4781]: I0227 01:01:00.363924 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-combined-ca-bundle\") pod \"keystone-cron-29535901-2chr7\" (UID: \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\") " pod="openstack/keystone-cron-29535901-2chr7" Feb 27 01:01:00 crc kubenswrapper[4781]: I0227 01:01:00.363958 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-config-data\") pod \"keystone-cron-29535901-2chr7\" (UID: \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\") " pod="openstack/keystone-cron-29535901-2chr7" Feb 27 01:01:00 crc kubenswrapper[4781]: I0227 01:01:00.364014 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-fernet-keys\") pod \"keystone-cron-29535901-2chr7\" (UID: \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\") " pod="openstack/keystone-cron-29535901-2chr7" Feb 27 01:01:00 crc kubenswrapper[4781]: I0227 01:01:00.370148 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-config-data\") pod \"keystone-cron-29535901-2chr7\" (UID: \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\") " pod="openstack/keystone-cron-29535901-2chr7" Feb 27 01:01:00 crc kubenswrapper[4781]: I0227 01:01:00.371233 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-fernet-keys\") pod \"keystone-cron-29535901-2chr7\" (UID: \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\") " pod="openstack/keystone-cron-29535901-2chr7" Feb 27 01:01:00 crc kubenswrapper[4781]: I0227 01:01:00.371257 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-combined-ca-bundle\") pod \"keystone-cron-29535901-2chr7\" (UID: \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\") " pod="openstack/keystone-cron-29535901-2chr7" Feb 27 01:01:00 crc kubenswrapper[4781]: I0227 01:01:00.381089 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lts4\" (UniqueName: \"kubernetes.io/projected/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-kube-api-access-2lts4\") pod \"keystone-cron-29535901-2chr7\" (UID: \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\") " pod="openstack/keystone-cron-29535901-2chr7" Feb 27 01:01:00 crc kubenswrapper[4781]: I0227 01:01:00.478166 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29535901-2chr7" Feb 27 01:01:00 crc kubenswrapper[4781]: I0227 01:01:00.981524 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29535901-2chr7"] Feb 27 01:01:01 crc kubenswrapper[4781]: I0227 01:01:01.019915 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cwjz6" event={"ID":"8645ad19-e972-4563-8c61-0b409e68654f","Type":"ContainerStarted","Data":"8f551137935dbc4a86b866d516940321cb5ee44d8037e609568d70411e9d07fb"} Feb 27 01:01:02 crc kubenswrapper[4781]: I0227 01:01:02.030751 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29535901-2chr7" event={"ID":"8f6a0640-2204-47a2-a550-7a7bb14ebc0d","Type":"ContainerStarted","Data":"7557da63e2de03059baf804218735736137f6a0ae74a8bf68e61c8dde24476f0"} Feb 27 01:01:02 crc kubenswrapper[4781]: I0227 01:01:02.031121 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29535901-2chr7" event={"ID":"8f6a0640-2204-47a2-a550-7a7bb14ebc0d","Type":"ContainerStarted","Data":"537cda34ad3149982b0c8785cb238f7f78c9e98fdecdc0beb26e59d3dd7545ce"} Feb 27 01:01:02 crc kubenswrapper[4781]: I0227 01:01:02.049237 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29535901-2chr7" podStartSLOduration=2.049216542 podStartE2EDuration="2.049216542s" podCreationTimestamp="2026-02-27 01:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:01:02.04500643 +0000 UTC m=+3331.302545994" watchObservedRunningTime="2026-02-27 01:01:02.049216542 +0000 UTC m=+3331.306756096" Feb 27 01:01:06 crc kubenswrapper[4781]: I0227 01:01:06.067248 4781 generic.go:334] "Generic (PLEG): container finished" podID="8645ad19-e972-4563-8c61-0b409e68654f" containerID="8f551137935dbc4a86b866d516940321cb5ee44d8037e609568d70411e9d07fb" 
exitCode=0 Feb 27 01:01:06 crc kubenswrapper[4781]: I0227 01:01:06.067319 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cwjz6" event={"ID":"8645ad19-e972-4563-8c61-0b409e68654f","Type":"ContainerDied","Data":"8f551137935dbc4a86b866d516940321cb5ee44d8037e609568d70411e9d07fb"} Feb 27 01:01:07 crc kubenswrapper[4781]: I0227 01:01:07.081338 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cwjz6" event={"ID":"8645ad19-e972-4563-8c61-0b409e68654f","Type":"ContainerStarted","Data":"fc7360c5fb835cbe31d71a0fcfd8acf8a19ca0f33fe6c4dd9c1917f775a9f9e1"} Feb 27 01:01:07 crc kubenswrapper[4781]: I0227 01:01:07.083282 4781 generic.go:334] "Generic (PLEG): container finished" podID="8f6a0640-2204-47a2-a550-7a7bb14ebc0d" containerID="7557da63e2de03059baf804218735736137f6a0ae74a8bf68e61c8dde24476f0" exitCode=0 Feb 27 01:01:07 crc kubenswrapper[4781]: I0227 01:01:07.083329 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29535901-2chr7" event={"ID":"8f6a0640-2204-47a2-a550-7a7bb14ebc0d","Type":"ContainerDied","Data":"7557da63e2de03059baf804218735736137f6a0ae74a8bf68e61c8dde24476f0"} Feb 27 01:01:07 crc kubenswrapper[4781]: I0227 01:01:07.109343 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cwjz6" podStartSLOduration=3.192889401 podStartE2EDuration="27.10932431s" podCreationTimestamp="2026-02-27 01:00:40 +0000 UTC" firstStartedPulling="2026-02-27 01:00:42.837876343 +0000 UTC m=+3312.095415887" lastFinishedPulling="2026-02-27 01:01:06.754311242 +0000 UTC m=+3336.011850796" observedRunningTime="2026-02-27 01:01:07.102994903 +0000 UTC m=+3336.360534457" watchObservedRunningTime="2026-02-27 01:01:07.10932431 +0000 UTC m=+3336.366863864" Feb 27 01:01:08 crc kubenswrapper[4781]: I0227 01:01:08.617611 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29535901-2chr7" Feb 27 01:01:08 crc kubenswrapper[4781]: I0227 01:01:08.650295 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lts4\" (UniqueName: \"kubernetes.io/projected/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-kube-api-access-2lts4\") pod \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\" (UID: \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\") " Feb 27 01:01:08 crc kubenswrapper[4781]: I0227 01:01:08.650376 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-fernet-keys\") pod \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\" (UID: \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\") " Feb 27 01:01:08 crc kubenswrapper[4781]: I0227 01:01:08.650447 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-combined-ca-bundle\") pod \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\" (UID: \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\") " Feb 27 01:01:08 crc kubenswrapper[4781]: I0227 01:01:08.650643 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-config-data\") pod \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\" (UID: \"8f6a0640-2204-47a2-a550-7a7bb14ebc0d\") " Feb 27 01:01:08 crc kubenswrapper[4781]: I0227 01:01:08.658721 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8f6a0640-2204-47a2-a550-7a7bb14ebc0d" (UID: "8f6a0640-2204-47a2-a550-7a7bb14ebc0d"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:01:08 crc kubenswrapper[4781]: I0227 01:01:08.667860 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-kube-api-access-2lts4" (OuterVolumeSpecName: "kube-api-access-2lts4") pod "8f6a0640-2204-47a2-a550-7a7bb14ebc0d" (UID: "8f6a0640-2204-47a2-a550-7a7bb14ebc0d"). InnerVolumeSpecName "kube-api-access-2lts4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:01:08 crc kubenswrapper[4781]: I0227 01:01:08.697054 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8f6a0640-2204-47a2-a550-7a7bb14ebc0d" (UID: "8f6a0640-2204-47a2-a550-7a7bb14ebc0d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:01:08 crc kubenswrapper[4781]: I0227 01:01:08.724071 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-config-data" (OuterVolumeSpecName: "config-data") pod "8f6a0640-2204-47a2-a550-7a7bb14ebc0d" (UID: "8f6a0640-2204-47a2-a550-7a7bb14ebc0d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:01:08 crc kubenswrapper[4781]: I0227 01:01:08.752971 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 01:01:08 crc kubenswrapper[4781]: I0227 01:01:08.753022 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lts4\" (UniqueName: \"kubernetes.io/projected/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-kube-api-access-2lts4\") on node \"crc\" DevicePath \"\"" Feb 27 01:01:08 crc kubenswrapper[4781]: I0227 01:01:08.753048 4781 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 27 01:01:08 crc kubenswrapper[4781]: I0227 01:01:08.753059 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f6a0640-2204-47a2-a550-7a7bb14ebc0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:01:09 crc kubenswrapper[4781]: I0227 01:01:09.104757 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29535901-2chr7" event={"ID":"8f6a0640-2204-47a2-a550-7a7bb14ebc0d","Type":"ContainerDied","Data":"537cda34ad3149982b0c8785cb238f7f78c9e98fdecdc0beb26e59d3dd7545ce"} Feb 27 01:01:09 crc kubenswrapper[4781]: I0227 01:01:09.104805 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="537cda34ad3149982b0c8785cb238f7f78c9e98fdecdc0beb26e59d3dd7545ce" Feb 27 01:01:09 crc kubenswrapper[4781]: I0227 01:01:09.104841 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29535901-2chr7" Feb 27 01:01:11 crc kubenswrapper[4781]: I0227 01:01:11.222087 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cwjz6" Feb 27 01:01:11 crc kubenswrapper[4781]: I0227 01:01:11.222701 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cwjz6" Feb 27 01:01:12 crc kubenswrapper[4781]: I0227 01:01:12.271651 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cwjz6" podUID="8645ad19-e972-4563-8c61-0b409e68654f" containerName="registry-server" probeResult="failure" output=< Feb 27 01:01:12 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s Feb 27 01:01:12 crc kubenswrapper[4781]: > Feb 27 01:01:14 crc kubenswrapper[4781]: I0227 01:01:14.310167 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 01:01:14 crc kubenswrapper[4781]: E0227 01:01:14.310738 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:01:22 crc kubenswrapper[4781]: I0227 01:01:22.269433 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cwjz6" podUID="8645ad19-e972-4563-8c61-0b409e68654f" containerName="registry-server" probeResult="failure" output=< Feb 27 01:01:22 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s Feb 27 01:01:22 crc kubenswrapper[4781]: > Feb 27 01:01:25 crc kubenswrapper[4781]: I0227 
01:01:25.309404 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 01:01:25 crc kubenswrapper[4781]: E0227 01:01:25.310347 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:01:32 crc kubenswrapper[4781]: I0227 01:01:32.269559 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cwjz6" podUID="8645ad19-e972-4563-8c61-0b409e68654f" containerName="registry-server" probeResult="failure" output=< Feb 27 01:01:32 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s Feb 27 01:01:32 crc kubenswrapper[4781]: > Feb 27 01:01:39 crc kubenswrapper[4781]: I0227 01:01:39.309550 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 01:01:39 crc kubenswrapper[4781]: E0227 01:01:39.310462 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:01:41 crc kubenswrapper[4781]: I0227 01:01:41.273199 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cwjz6" Feb 27 01:01:41 crc kubenswrapper[4781]: I0227 01:01:41.324404 4781 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cwjz6" Feb 27 01:01:42 crc kubenswrapper[4781]: I0227 01:01:42.085873 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cwjz6"] Feb 27 01:01:42 crc kubenswrapper[4781]: I0227 01:01:42.459552 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cwjz6" podUID="8645ad19-e972-4563-8c61-0b409e68654f" containerName="registry-server" containerID="cri-o://fc7360c5fb835cbe31d71a0fcfd8acf8a19ca0f33fe6c4dd9c1917f775a9f9e1" gracePeriod=2 Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.026295 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cwjz6" Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.150285 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gkj9\" (UniqueName: \"kubernetes.io/projected/8645ad19-e972-4563-8c61-0b409e68654f-kube-api-access-2gkj9\") pod \"8645ad19-e972-4563-8c61-0b409e68654f\" (UID: \"8645ad19-e972-4563-8c61-0b409e68654f\") " Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.150658 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8645ad19-e972-4563-8c61-0b409e68654f-catalog-content\") pod \"8645ad19-e972-4563-8c61-0b409e68654f\" (UID: \"8645ad19-e972-4563-8c61-0b409e68654f\") " Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.150709 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8645ad19-e972-4563-8c61-0b409e68654f-utilities\") pod \"8645ad19-e972-4563-8c61-0b409e68654f\" (UID: \"8645ad19-e972-4563-8c61-0b409e68654f\") " Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.151525 4781 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8645ad19-e972-4563-8c61-0b409e68654f-utilities" (OuterVolumeSpecName: "utilities") pod "8645ad19-e972-4563-8c61-0b409e68654f" (UID: "8645ad19-e972-4563-8c61-0b409e68654f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.152301 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8645ad19-e972-4563-8c61-0b409e68654f-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.164538 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8645ad19-e972-4563-8c61-0b409e68654f-kube-api-access-2gkj9" (OuterVolumeSpecName: "kube-api-access-2gkj9") pod "8645ad19-e972-4563-8c61-0b409e68654f" (UID: "8645ad19-e972-4563-8c61-0b409e68654f"). InnerVolumeSpecName "kube-api-access-2gkj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.254990 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gkj9\" (UniqueName: \"kubernetes.io/projected/8645ad19-e972-4563-8c61-0b409e68654f-kube-api-access-2gkj9\") on node \"crc\" DevicePath \"\"" Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.278607 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8645ad19-e972-4563-8c61-0b409e68654f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8645ad19-e972-4563-8c61-0b409e68654f" (UID: "8645ad19-e972-4563-8c61-0b409e68654f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.358913 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8645ad19-e972-4563-8c61-0b409e68654f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.474921 4781 generic.go:334] "Generic (PLEG): container finished" podID="8645ad19-e972-4563-8c61-0b409e68654f" containerID="fc7360c5fb835cbe31d71a0fcfd8acf8a19ca0f33fe6c4dd9c1917f775a9f9e1" exitCode=0 Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.474970 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cwjz6" event={"ID":"8645ad19-e972-4563-8c61-0b409e68654f","Type":"ContainerDied","Data":"fc7360c5fb835cbe31d71a0fcfd8acf8a19ca0f33fe6c4dd9c1917f775a9f9e1"} Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.475001 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cwjz6" event={"ID":"8645ad19-e972-4563-8c61-0b409e68654f","Type":"ContainerDied","Data":"ce0753b848865fc62648932e9ede65f00c3e009f79642c333fed3ad246907b56"} Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.475017 4781 scope.go:117] "RemoveContainer" containerID="fc7360c5fb835cbe31d71a0fcfd8acf8a19ca0f33fe6c4dd9c1917f775a9f9e1" Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.475155 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cwjz6" Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.495875 4781 scope.go:117] "RemoveContainer" containerID="8f551137935dbc4a86b866d516940321cb5ee44d8037e609568d70411e9d07fb" Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.501089 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cwjz6"] Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.511088 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cwjz6"] Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.528425 4781 scope.go:117] "RemoveContainer" containerID="f8b0aa733e6926d4ead2ba5159b9d6863acb414faa55e255836a8eda034492d4" Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.619902 4781 scope.go:117] "RemoveContainer" containerID="fc7360c5fb835cbe31d71a0fcfd8acf8a19ca0f33fe6c4dd9c1917f775a9f9e1" Feb 27 01:01:43 crc kubenswrapper[4781]: E0227 01:01:43.620995 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc7360c5fb835cbe31d71a0fcfd8acf8a19ca0f33fe6c4dd9c1917f775a9f9e1\": container with ID starting with fc7360c5fb835cbe31d71a0fcfd8acf8a19ca0f33fe6c4dd9c1917f775a9f9e1 not found: ID does not exist" containerID="fc7360c5fb835cbe31d71a0fcfd8acf8a19ca0f33fe6c4dd9c1917f775a9f9e1" Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.621048 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc7360c5fb835cbe31d71a0fcfd8acf8a19ca0f33fe6c4dd9c1917f775a9f9e1"} err="failed to get container status \"fc7360c5fb835cbe31d71a0fcfd8acf8a19ca0f33fe6c4dd9c1917f775a9f9e1\": rpc error: code = NotFound desc = could not find container \"fc7360c5fb835cbe31d71a0fcfd8acf8a19ca0f33fe6c4dd9c1917f775a9f9e1\": container with ID starting with fc7360c5fb835cbe31d71a0fcfd8acf8a19ca0f33fe6c4dd9c1917f775a9f9e1 not found: ID does 
not exist" Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.621080 4781 scope.go:117] "RemoveContainer" containerID="8f551137935dbc4a86b866d516940321cb5ee44d8037e609568d70411e9d07fb" Feb 27 01:01:43 crc kubenswrapper[4781]: E0227 01:01:43.621929 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f551137935dbc4a86b866d516940321cb5ee44d8037e609568d70411e9d07fb\": container with ID starting with 8f551137935dbc4a86b866d516940321cb5ee44d8037e609568d70411e9d07fb not found: ID does not exist" containerID="8f551137935dbc4a86b866d516940321cb5ee44d8037e609568d70411e9d07fb" Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.621970 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f551137935dbc4a86b866d516940321cb5ee44d8037e609568d70411e9d07fb"} err="failed to get container status \"8f551137935dbc4a86b866d516940321cb5ee44d8037e609568d70411e9d07fb\": rpc error: code = NotFound desc = could not find container \"8f551137935dbc4a86b866d516940321cb5ee44d8037e609568d70411e9d07fb\": container with ID starting with 8f551137935dbc4a86b866d516940321cb5ee44d8037e609568d70411e9d07fb not found: ID does not exist" Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.621997 4781 scope.go:117] "RemoveContainer" containerID="f8b0aa733e6926d4ead2ba5159b9d6863acb414faa55e255836a8eda034492d4" Feb 27 01:01:43 crc kubenswrapper[4781]: E0227 01:01:43.623162 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8b0aa733e6926d4ead2ba5159b9d6863acb414faa55e255836a8eda034492d4\": container with ID starting with f8b0aa733e6926d4ead2ba5159b9d6863acb414faa55e255836a8eda034492d4 not found: ID does not exist" containerID="f8b0aa733e6926d4ead2ba5159b9d6863acb414faa55e255836a8eda034492d4" Feb 27 01:01:43 crc kubenswrapper[4781]: I0227 01:01:43.623205 4781 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8b0aa733e6926d4ead2ba5159b9d6863acb414faa55e255836a8eda034492d4"} err="failed to get container status \"f8b0aa733e6926d4ead2ba5159b9d6863acb414faa55e255836a8eda034492d4\": rpc error: code = NotFound desc = could not find container \"f8b0aa733e6926d4ead2ba5159b9d6863acb414faa55e255836a8eda034492d4\": container with ID starting with f8b0aa733e6926d4ead2ba5159b9d6863acb414faa55e255836a8eda034492d4 not found: ID does not exist" Feb 27 01:01:45 crc kubenswrapper[4781]: I0227 01:01:45.326607 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8645ad19-e972-4563-8c61-0b409e68654f" path="/var/lib/kubelet/pods/8645ad19-e972-4563-8c61-0b409e68654f/volumes" Feb 27 01:01:54 crc kubenswrapper[4781]: I0227 01:01:54.309919 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 01:01:54 crc kubenswrapper[4781]: E0227 01:01:54.312045 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.251669 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jdd4z"] Feb 27 01:01:58 crc kubenswrapper[4781]: E0227 01:01:58.252764 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8645ad19-e972-4563-8c61-0b409e68654f" containerName="extract-content" Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.252782 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8645ad19-e972-4563-8c61-0b409e68654f" containerName="extract-content" Feb 27 
01:01:58 crc kubenswrapper[4781]: E0227 01:01:58.252802 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f6a0640-2204-47a2-a550-7a7bb14ebc0d" containerName="keystone-cron" Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.252811 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f6a0640-2204-47a2-a550-7a7bb14ebc0d" containerName="keystone-cron" Feb 27 01:01:58 crc kubenswrapper[4781]: E0227 01:01:58.252832 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8645ad19-e972-4563-8c61-0b409e68654f" containerName="registry-server" Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.252840 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8645ad19-e972-4563-8c61-0b409e68654f" containerName="registry-server" Feb 27 01:01:58 crc kubenswrapper[4781]: E0227 01:01:58.252872 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8645ad19-e972-4563-8c61-0b409e68654f" containerName="extract-utilities" Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.252880 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8645ad19-e972-4563-8c61-0b409e68654f" containerName="extract-utilities" Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.253072 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="8645ad19-e972-4563-8c61-0b409e68654f" containerName="registry-server" Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.253084 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f6a0640-2204-47a2-a550-7a7bb14ebc0d" containerName="keystone-cron" Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.254527 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jdd4z" Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.273135 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jdd4z"] Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.383932 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f253a7c-3c8a-48a7-a91f-676dd51d64bf-utilities\") pod \"certified-operators-jdd4z\" (UID: \"3f253a7c-3c8a-48a7-a91f-676dd51d64bf\") " pod="openshift-marketplace/certified-operators-jdd4z" Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.384323 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bsth\" (UniqueName: \"kubernetes.io/projected/3f253a7c-3c8a-48a7-a91f-676dd51d64bf-kube-api-access-2bsth\") pod \"certified-operators-jdd4z\" (UID: \"3f253a7c-3c8a-48a7-a91f-676dd51d64bf\") " pod="openshift-marketplace/certified-operators-jdd4z" Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.384449 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f253a7c-3c8a-48a7-a91f-676dd51d64bf-catalog-content\") pod \"certified-operators-jdd4z\" (UID: \"3f253a7c-3c8a-48a7-a91f-676dd51d64bf\") " pod="openshift-marketplace/certified-operators-jdd4z" Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.487959 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f253a7c-3c8a-48a7-a91f-676dd51d64bf-catalog-content\") pod \"certified-operators-jdd4z\" (UID: \"3f253a7c-3c8a-48a7-a91f-676dd51d64bf\") " pod="openshift-marketplace/certified-operators-jdd4z" Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.488322 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f253a7c-3c8a-48a7-a91f-676dd51d64bf-utilities\") pod \"certified-operators-jdd4z\" (UID: \"3f253a7c-3c8a-48a7-a91f-676dd51d64bf\") " pod="openshift-marketplace/certified-operators-jdd4z" Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.488443 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bsth\" (UniqueName: \"kubernetes.io/projected/3f253a7c-3c8a-48a7-a91f-676dd51d64bf-kube-api-access-2bsth\") pod \"certified-operators-jdd4z\" (UID: \"3f253a7c-3c8a-48a7-a91f-676dd51d64bf\") " pod="openshift-marketplace/certified-operators-jdd4z" Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.488931 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f253a7c-3c8a-48a7-a91f-676dd51d64bf-utilities\") pod \"certified-operators-jdd4z\" (UID: \"3f253a7c-3c8a-48a7-a91f-676dd51d64bf\") " pod="openshift-marketplace/certified-operators-jdd4z" Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.489160 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f253a7c-3c8a-48a7-a91f-676dd51d64bf-catalog-content\") pod \"certified-operators-jdd4z\" (UID: \"3f253a7c-3c8a-48a7-a91f-676dd51d64bf\") " pod="openshift-marketplace/certified-operators-jdd4z" Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.512365 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bsth\" (UniqueName: \"kubernetes.io/projected/3f253a7c-3c8a-48a7-a91f-676dd51d64bf-kube-api-access-2bsth\") pod \"certified-operators-jdd4z\" (UID: \"3f253a7c-3c8a-48a7-a91f-676dd51d64bf\") " pod="openshift-marketplace/certified-operators-jdd4z" Feb 27 01:01:58 crc kubenswrapper[4781]: I0227 01:01:58.582032 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jdd4z" Feb 27 01:01:59 crc kubenswrapper[4781]: I0227 01:01:59.140756 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jdd4z"] Feb 27 01:01:59 crc kubenswrapper[4781]: I0227 01:01:59.638325 4781 generic.go:334] "Generic (PLEG): container finished" podID="3f253a7c-3c8a-48a7-a91f-676dd51d64bf" containerID="026d3830c0bb6ea15198209fa6b318805644984b6eb7061dd4202f39bdc8a808" exitCode=0 Feb 27 01:01:59 crc kubenswrapper[4781]: I0227 01:01:59.638368 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdd4z" event={"ID":"3f253a7c-3c8a-48a7-a91f-676dd51d64bf","Type":"ContainerDied","Data":"026d3830c0bb6ea15198209fa6b318805644984b6eb7061dd4202f39bdc8a808"} Feb 27 01:01:59 crc kubenswrapper[4781]: I0227 01:01:59.638394 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdd4z" event={"ID":"3f253a7c-3c8a-48a7-a91f-676dd51d64bf","Type":"ContainerStarted","Data":"df273d841b3449944d268e56a0e28f11d82f21d7f09bba2a5b92b1c4ad1eb152"} Feb 27 01:02:00 crc kubenswrapper[4781]: I0227 01:02:00.179923 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535902-bkgwj"] Feb 27 01:02:00 crc kubenswrapper[4781]: I0227 01:02:00.181414 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535902-bkgwj" Feb 27 01:02:00 crc kubenswrapper[4781]: I0227 01:02:00.185961 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:02:00 crc kubenswrapper[4781]: I0227 01:02:00.186817 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:02:00 crc kubenswrapper[4781]: I0227 01:02:00.199087 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535902-bkgwj"] Feb 27 01:02:00 crc kubenswrapper[4781]: I0227 01:02:00.199282 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 01:02:00 crc kubenswrapper[4781]: I0227 01:02:00.328045 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-682f8\" (UniqueName: \"kubernetes.io/projected/da201e7f-72da-4998-8ecb-98a8814f423d-kube-api-access-682f8\") pod \"auto-csr-approver-29535902-bkgwj\" (UID: \"da201e7f-72da-4998-8ecb-98a8814f423d\") " pod="openshift-infra/auto-csr-approver-29535902-bkgwj" Feb 27 01:02:00 crc kubenswrapper[4781]: I0227 01:02:00.431392 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-682f8\" (UniqueName: \"kubernetes.io/projected/da201e7f-72da-4998-8ecb-98a8814f423d-kube-api-access-682f8\") pod \"auto-csr-approver-29535902-bkgwj\" (UID: \"da201e7f-72da-4998-8ecb-98a8814f423d\") " pod="openshift-infra/auto-csr-approver-29535902-bkgwj" Feb 27 01:02:00 crc kubenswrapper[4781]: I0227 01:02:00.450700 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-682f8\" (UniqueName: \"kubernetes.io/projected/da201e7f-72da-4998-8ecb-98a8814f423d-kube-api-access-682f8\") pod \"auto-csr-approver-29535902-bkgwj\" (UID: \"da201e7f-72da-4998-8ecb-98a8814f423d\") " 
pod="openshift-infra/auto-csr-approver-29535902-bkgwj" Feb 27 01:02:00 crc kubenswrapper[4781]: I0227 01:02:00.499680 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535902-bkgwj" Feb 27 01:02:01 crc kubenswrapper[4781]: I0227 01:02:01.010691 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535902-bkgwj"] Feb 27 01:02:01 crc kubenswrapper[4781]: I0227 01:02:01.662536 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535902-bkgwj" event={"ID":"da201e7f-72da-4998-8ecb-98a8814f423d","Type":"ContainerStarted","Data":"ba1dabaf3614d6322961d2b5cde702b8395ef307cf5a984baf680198a1f73dc4"} Feb 27 01:02:01 crc kubenswrapper[4781]: I0227 01:02:01.665863 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdd4z" event={"ID":"3f253a7c-3c8a-48a7-a91f-676dd51d64bf","Type":"ContainerStarted","Data":"a1968d8d2992e2ec646c1d24e0978666aa16db8d0c1b7661201047f138ec5f5a"} Feb 27 01:02:04 crc kubenswrapper[4781]: I0227 01:02:04.695242 4781 generic.go:334] "Generic (PLEG): container finished" podID="3f253a7c-3c8a-48a7-a91f-676dd51d64bf" containerID="a1968d8d2992e2ec646c1d24e0978666aa16db8d0c1b7661201047f138ec5f5a" exitCode=0 Feb 27 01:02:04 crc kubenswrapper[4781]: I0227 01:02:04.695327 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdd4z" event={"ID":"3f253a7c-3c8a-48a7-a91f-676dd51d64bf","Type":"ContainerDied","Data":"a1968d8d2992e2ec646c1d24e0978666aa16db8d0c1b7661201047f138ec5f5a"} Feb 27 01:02:04 crc kubenswrapper[4781]: I0227 01:02:04.699101 4781 generic.go:334] "Generic (PLEG): container finished" podID="da201e7f-72da-4998-8ecb-98a8814f423d" containerID="cba817e11e179b47fa5e55d89f7bb6242121790488edf6a29e663a57c82230bd" exitCode=0 Feb 27 01:02:04 crc kubenswrapper[4781]: I0227 01:02:04.699146 4781 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535902-bkgwj" event={"ID":"da201e7f-72da-4998-8ecb-98a8814f423d","Type":"ContainerDied","Data":"cba817e11e179b47fa5e55d89f7bb6242121790488edf6a29e663a57c82230bd"} Feb 27 01:02:05 crc kubenswrapper[4781]: I0227 01:02:05.310401 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 01:02:05 crc kubenswrapper[4781]: E0227 01:02:05.310670 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:02:06 crc kubenswrapper[4781]: I0227 01:02:06.168509 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535902-bkgwj" Feb 27 01:02:06 crc kubenswrapper[4781]: I0227 01:02:06.252889 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-682f8\" (UniqueName: \"kubernetes.io/projected/da201e7f-72da-4998-8ecb-98a8814f423d-kube-api-access-682f8\") pod \"da201e7f-72da-4998-8ecb-98a8814f423d\" (UID: \"da201e7f-72da-4998-8ecb-98a8814f423d\") " Feb 27 01:02:06 crc kubenswrapper[4781]: I0227 01:02:06.267987 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da201e7f-72da-4998-8ecb-98a8814f423d-kube-api-access-682f8" (OuterVolumeSpecName: "kube-api-access-682f8") pod "da201e7f-72da-4998-8ecb-98a8814f423d" (UID: "da201e7f-72da-4998-8ecb-98a8814f423d"). InnerVolumeSpecName "kube-api-access-682f8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:02:06 crc kubenswrapper[4781]: I0227 01:02:06.355844 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-682f8\" (UniqueName: \"kubernetes.io/projected/da201e7f-72da-4998-8ecb-98a8814f423d-kube-api-access-682f8\") on node \"crc\" DevicePath \"\"" Feb 27 01:02:06 crc kubenswrapper[4781]: I0227 01:02:06.723491 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535902-bkgwj" event={"ID":"da201e7f-72da-4998-8ecb-98a8814f423d","Type":"ContainerDied","Data":"ba1dabaf3614d6322961d2b5cde702b8395ef307cf5a984baf680198a1f73dc4"} Feb 27 01:02:06 crc kubenswrapper[4781]: I0227 01:02:06.723563 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba1dabaf3614d6322961d2b5cde702b8395ef307cf5a984baf680198a1f73dc4" Feb 27 01:02:06 crc kubenswrapper[4781]: I0227 01:02:06.723522 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535902-bkgwj" Feb 27 01:02:06 crc kubenswrapper[4781]: I0227 01:02:06.726621 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdd4z" event={"ID":"3f253a7c-3c8a-48a7-a91f-676dd51d64bf","Type":"ContainerStarted","Data":"0e7be5eda43a602846f569508d5e00bb38d1c5988fa08da324bcd0b474df1c56"} Feb 27 01:02:06 crc kubenswrapper[4781]: I0227 01:02:06.747337 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jdd4z" podStartSLOduration=2.658468756 podStartE2EDuration="8.747302957s" podCreationTimestamp="2026-02-27 01:01:58 +0000 UTC" firstStartedPulling="2026-02-27 01:01:59.640948349 +0000 UTC m=+3388.898487893" lastFinishedPulling="2026-02-27 01:02:05.72978254 +0000 UTC m=+3394.987322094" observedRunningTime="2026-02-27 01:02:06.746359422 +0000 UTC m=+3396.003898976" watchObservedRunningTime="2026-02-27 
01:02:06.747302957 +0000 UTC m=+3396.004842511" Feb 27 01:02:07 crc kubenswrapper[4781]: I0227 01:02:07.243475 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535896-46l2w"] Feb 27 01:02:07 crc kubenswrapper[4781]: I0227 01:02:07.256509 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535896-46l2w"] Feb 27 01:02:07 crc kubenswrapper[4781]: I0227 01:02:07.320524 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c6b6160-b122-4248-b7ed-a206d3bc633e" path="/var/lib/kubelet/pods/4c6b6160-b122-4248-b7ed-a206d3bc633e/volumes" Feb 27 01:02:08 crc kubenswrapper[4781]: I0227 01:02:08.584060 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jdd4z" Feb 27 01:02:08 crc kubenswrapper[4781]: I0227 01:02:08.584124 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jdd4z" Feb 27 01:02:08 crc kubenswrapper[4781]: I0227 01:02:08.638253 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jdd4z" Feb 27 01:02:17 crc kubenswrapper[4781]: I0227 01:02:17.309249 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 01:02:17 crc kubenswrapper[4781]: E0227 01:02:17.309914 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:02:18 crc kubenswrapper[4781]: I0227 01:02:18.646984 4781 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jdd4z" Feb 27 01:02:18 crc kubenswrapper[4781]: I0227 01:02:18.705338 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jdd4z"] Feb 27 01:02:18 crc kubenswrapper[4781]: I0227 01:02:18.849689 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jdd4z" podUID="3f253a7c-3c8a-48a7-a91f-676dd51d64bf" containerName="registry-server" containerID="cri-o://0e7be5eda43a602846f569508d5e00bb38d1c5988fa08da324bcd0b474df1c56" gracePeriod=2 Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.503965 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jdd4z" Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.663176 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f253a7c-3c8a-48a7-a91f-676dd51d64bf-catalog-content\") pod \"3f253a7c-3c8a-48a7-a91f-676dd51d64bf\" (UID: \"3f253a7c-3c8a-48a7-a91f-676dd51d64bf\") " Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.663294 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bsth\" (UniqueName: \"kubernetes.io/projected/3f253a7c-3c8a-48a7-a91f-676dd51d64bf-kube-api-access-2bsth\") pod \"3f253a7c-3c8a-48a7-a91f-676dd51d64bf\" (UID: \"3f253a7c-3c8a-48a7-a91f-676dd51d64bf\") " Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.663457 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f253a7c-3c8a-48a7-a91f-676dd51d64bf-utilities\") pod \"3f253a7c-3c8a-48a7-a91f-676dd51d64bf\" (UID: \"3f253a7c-3c8a-48a7-a91f-676dd51d64bf\") " Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.664425 4781 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f253a7c-3c8a-48a7-a91f-676dd51d64bf-utilities" (OuterVolumeSpecName: "utilities") pod "3f253a7c-3c8a-48a7-a91f-676dd51d64bf" (UID: "3f253a7c-3c8a-48a7-a91f-676dd51d64bf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.669152 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f253a7c-3c8a-48a7-a91f-676dd51d64bf-kube-api-access-2bsth" (OuterVolumeSpecName: "kube-api-access-2bsth") pod "3f253a7c-3c8a-48a7-a91f-676dd51d64bf" (UID: "3f253a7c-3c8a-48a7-a91f-676dd51d64bf"). InnerVolumeSpecName "kube-api-access-2bsth". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.708148 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f253a7c-3c8a-48a7-a91f-676dd51d64bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3f253a7c-3c8a-48a7-a91f-676dd51d64bf" (UID: "3f253a7c-3c8a-48a7-a91f-676dd51d64bf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.766246 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bsth\" (UniqueName: \"kubernetes.io/projected/3f253a7c-3c8a-48a7-a91f-676dd51d64bf-kube-api-access-2bsth\") on node \"crc\" DevicePath \"\"" Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.766278 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f253a7c-3c8a-48a7-a91f-676dd51d64bf-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.766288 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f253a7c-3c8a-48a7-a91f-676dd51d64bf-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.860903 4781 generic.go:334] "Generic (PLEG): container finished" podID="3f253a7c-3c8a-48a7-a91f-676dd51d64bf" containerID="0e7be5eda43a602846f569508d5e00bb38d1c5988fa08da324bcd0b474df1c56" exitCode=0 Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.860950 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdd4z" event={"ID":"3f253a7c-3c8a-48a7-a91f-676dd51d64bf","Type":"ContainerDied","Data":"0e7be5eda43a602846f569508d5e00bb38d1c5988fa08da324bcd0b474df1c56"} Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.860980 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdd4z" event={"ID":"3f253a7c-3c8a-48a7-a91f-676dd51d64bf","Type":"ContainerDied","Data":"df273d841b3449944d268e56a0e28f11d82f21d7f09bba2a5b92b1c4ad1eb152"} Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.860957 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jdd4z" Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.860999 4781 scope.go:117] "RemoveContainer" containerID="0e7be5eda43a602846f569508d5e00bb38d1c5988fa08da324bcd0b474df1c56" Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.889544 4781 scope.go:117] "RemoveContainer" containerID="a1968d8d2992e2ec646c1d24e0978666aa16db8d0c1b7661201047f138ec5f5a" Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.901534 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jdd4z"] Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.911596 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jdd4z"] Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.916748 4781 scope.go:117] "RemoveContainer" containerID="026d3830c0bb6ea15198209fa6b318805644984b6eb7061dd4202f39bdc8a808" Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.960064 4781 scope.go:117] "RemoveContainer" containerID="0e7be5eda43a602846f569508d5e00bb38d1c5988fa08da324bcd0b474df1c56" Feb 27 01:02:19 crc kubenswrapper[4781]: E0227 01:02:19.960611 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e7be5eda43a602846f569508d5e00bb38d1c5988fa08da324bcd0b474df1c56\": container with ID starting with 0e7be5eda43a602846f569508d5e00bb38d1c5988fa08da324bcd0b474df1c56 not found: ID does not exist" containerID="0e7be5eda43a602846f569508d5e00bb38d1c5988fa08da324bcd0b474df1c56" Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.960677 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e7be5eda43a602846f569508d5e00bb38d1c5988fa08da324bcd0b474df1c56"} err="failed to get container status \"0e7be5eda43a602846f569508d5e00bb38d1c5988fa08da324bcd0b474df1c56\": rpc error: code = NotFound desc = could not find 
container \"0e7be5eda43a602846f569508d5e00bb38d1c5988fa08da324bcd0b474df1c56\": container with ID starting with 0e7be5eda43a602846f569508d5e00bb38d1c5988fa08da324bcd0b474df1c56 not found: ID does not exist" Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.960715 4781 scope.go:117] "RemoveContainer" containerID="a1968d8d2992e2ec646c1d24e0978666aa16db8d0c1b7661201047f138ec5f5a" Feb 27 01:02:19 crc kubenswrapper[4781]: E0227 01:02:19.961132 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1968d8d2992e2ec646c1d24e0978666aa16db8d0c1b7661201047f138ec5f5a\": container with ID starting with a1968d8d2992e2ec646c1d24e0978666aa16db8d0c1b7661201047f138ec5f5a not found: ID does not exist" containerID="a1968d8d2992e2ec646c1d24e0978666aa16db8d0c1b7661201047f138ec5f5a" Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.961222 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1968d8d2992e2ec646c1d24e0978666aa16db8d0c1b7661201047f138ec5f5a"} err="failed to get container status \"a1968d8d2992e2ec646c1d24e0978666aa16db8d0c1b7661201047f138ec5f5a\": rpc error: code = NotFound desc = could not find container \"a1968d8d2992e2ec646c1d24e0978666aa16db8d0c1b7661201047f138ec5f5a\": container with ID starting with a1968d8d2992e2ec646c1d24e0978666aa16db8d0c1b7661201047f138ec5f5a not found: ID does not exist" Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.961292 4781 scope.go:117] "RemoveContainer" containerID="026d3830c0bb6ea15198209fa6b318805644984b6eb7061dd4202f39bdc8a808" Feb 27 01:02:19 crc kubenswrapper[4781]: E0227 01:02:19.961587 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"026d3830c0bb6ea15198209fa6b318805644984b6eb7061dd4202f39bdc8a808\": container with ID starting with 026d3830c0bb6ea15198209fa6b318805644984b6eb7061dd4202f39bdc8a808 not found: ID does 
not exist" containerID="026d3830c0bb6ea15198209fa6b318805644984b6eb7061dd4202f39bdc8a808" Feb 27 01:02:19 crc kubenswrapper[4781]: I0227 01:02:19.961613 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"026d3830c0bb6ea15198209fa6b318805644984b6eb7061dd4202f39bdc8a808"} err="failed to get container status \"026d3830c0bb6ea15198209fa6b318805644984b6eb7061dd4202f39bdc8a808\": rpc error: code = NotFound desc = could not find container \"026d3830c0bb6ea15198209fa6b318805644984b6eb7061dd4202f39bdc8a808\": container with ID starting with 026d3830c0bb6ea15198209fa6b318805644984b6eb7061dd4202f39bdc8a808 not found: ID does not exist" Feb 27 01:02:21 crc kubenswrapper[4781]: I0227 01:02:21.323159 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f253a7c-3c8a-48a7-a91f-676dd51d64bf" path="/var/lib/kubelet/pods/3f253a7c-3c8a-48a7-a91f-676dd51d64bf/volumes" Feb 27 01:02:29 crc kubenswrapper[4781]: I0227 01:02:29.311802 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 01:02:29 crc kubenswrapper[4781]: E0227 01:02:29.314701 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:02:32 crc kubenswrapper[4781]: I0227 01:02:32.414395 4781 scope.go:117] "RemoveContainer" containerID="e8e722eebfb284cc61eb30213644cd7eb1815f8a77725715668d5116c8a7d0d7" Feb 27 01:02:42 crc kubenswrapper[4781]: I0227 01:02:42.310784 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 01:02:42 crc 
kubenswrapper[4781]: E0227 01:02:42.311381 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:02:55 crc kubenswrapper[4781]: I0227 01:02:55.310313 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 01:02:55 crc kubenswrapper[4781]: E0227 01:02:55.311856 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:03:06 crc kubenswrapper[4781]: I0227 01:03:06.983131 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kdj4m"] Feb 27 01:03:06 crc kubenswrapper[4781]: E0227 01:03:06.984222 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da201e7f-72da-4998-8ecb-98a8814f423d" containerName="oc" Feb 27 01:03:06 crc kubenswrapper[4781]: I0227 01:03:06.984238 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="da201e7f-72da-4998-8ecb-98a8814f423d" containerName="oc" Feb 27 01:03:06 crc kubenswrapper[4781]: E0227 01:03:06.984269 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f253a7c-3c8a-48a7-a91f-676dd51d64bf" containerName="extract-content" Feb 27 01:03:06 crc kubenswrapper[4781]: I0227 01:03:06.984278 4781 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="3f253a7c-3c8a-48a7-a91f-676dd51d64bf" containerName="extract-content" Feb 27 01:03:06 crc kubenswrapper[4781]: E0227 01:03:06.984294 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f253a7c-3c8a-48a7-a91f-676dd51d64bf" containerName="registry-server" Feb 27 01:03:06 crc kubenswrapper[4781]: I0227 01:03:06.984301 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f253a7c-3c8a-48a7-a91f-676dd51d64bf" containerName="registry-server" Feb 27 01:03:06 crc kubenswrapper[4781]: E0227 01:03:06.984332 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f253a7c-3c8a-48a7-a91f-676dd51d64bf" containerName="extract-utilities" Feb 27 01:03:06 crc kubenswrapper[4781]: I0227 01:03:06.984339 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f253a7c-3c8a-48a7-a91f-676dd51d64bf" containerName="extract-utilities" Feb 27 01:03:06 crc kubenswrapper[4781]: I0227 01:03:06.984560 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f253a7c-3c8a-48a7-a91f-676dd51d64bf" containerName="registry-server" Feb 27 01:03:06 crc kubenswrapper[4781]: I0227 01:03:06.984583 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="da201e7f-72da-4998-8ecb-98a8814f423d" containerName="oc" Feb 27 01:03:06 crc kubenswrapper[4781]: I0227 01:03:06.986598 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kdj4m" Feb 27 01:03:06 crc kubenswrapper[4781]: I0227 01:03:06.999484 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kdj4m"] Feb 27 01:03:07 crc kubenswrapper[4781]: I0227 01:03:07.100419 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/265ff10b-4377-40a2-af31-17901aa730b7-utilities\") pod \"community-operators-kdj4m\" (UID: \"265ff10b-4377-40a2-af31-17901aa730b7\") " pod="openshift-marketplace/community-operators-kdj4m" Feb 27 01:03:07 crc kubenswrapper[4781]: I0227 01:03:07.100574 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/265ff10b-4377-40a2-af31-17901aa730b7-catalog-content\") pod \"community-operators-kdj4m\" (UID: \"265ff10b-4377-40a2-af31-17901aa730b7\") " pod="openshift-marketplace/community-operators-kdj4m" Feb 27 01:03:07 crc kubenswrapper[4781]: I0227 01:03:07.100656 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8q49\" (UniqueName: \"kubernetes.io/projected/265ff10b-4377-40a2-af31-17901aa730b7-kube-api-access-t8q49\") pod \"community-operators-kdj4m\" (UID: \"265ff10b-4377-40a2-af31-17901aa730b7\") " pod="openshift-marketplace/community-operators-kdj4m" Feb 27 01:03:07 crc kubenswrapper[4781]: I0227 01:03:07.203901 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/265ff10b-4377-40a2-af31-17901aa730b7-utilities\") pod \"community-operators-kdj4m\" (UID: \"265ff10b-4377-40a2-af31-17901aa730b7\") " pod="openshift-marketplace/community-operators-kdj4m" Feb 27 01:03:07 crc kubenswrapper[4781]: I0227 01:03:07.204043 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/265ff10b-4377-40a2-af31-17901aa730b7-catalog-content\") pod \"community-operators-kdj4m\" (UID: \"265ff10b-4377-40a2-af31-17901aa730b7\") " pod="openshift-marketplace/community-operators-kdj4m" Feb 27 01:03:07 crc kubenswrapper[4781]: I0227 01:03:07.204091 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8q49\" (UniqueName: \"kubernetes.io/projected/265ff10b-4377-40a2-af31-17901aa730b7-kube-api-access-t8q49\") pod \"community-operators-kdj4m\" (UID: \"265ff10b-4377-40a2-af31-17901aa730b7\") " pod="openshift-marketplace/community-operators-kdj4m" Feb 27 01:03:07 crc kubenswrapper[4781]: I0227 01:03:07.204603 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/265ff10b-4377-40a2-af31-17901aa730b7-utilities\") pod \"community-operators-kdj4m\" (UID: \"265ff10b-4377-40a2-af31-17901aa730b7\") " pod="openshift-marketplace/community-operators-kdj4m" Feb 27 01:03:07 crc kubenswrapper[4781]: I0227 01:03:07.204709 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/265ff10b-4377-40a2-af31-17901aa730b7-catalog-content\") pod \"community-operators-kdj4m\" (UID: \"265ff10b-4377-40a2-af31-17901aa730b7\") " pod="openshift-marketplace/community-operators-kdj4m" Feb 27 01:03:07 crc kubenswrapper[4781]: I0227 01:03:07.226913 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8q49\" (UniqueName: \"kubernetes.io/projected/265ff10b-4377-40a2-af31-17901aa730b7-kube-api-access-t8q49\") pod \"community-operators-kdj4m\" (UID: \"265ff10b-4377-40a2-af31-17901aa730b7\") " pod="openshift-marketplace/community-operators-kdj4m" Feb 27 01:03:07 crc kubenswrapper[4781]: I0227 01:03:07.310277 4781 scope.go:117] "RemoveContainer" 
containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 01:03:07 crc kubenswrapper[4781]: E0227 01:03:07.310598 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:03:07 crc kubenswrapper[4781]: I0227 01:03:07.318688 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kdj4m" Feb 27 01:03:07 crc kubenswrapper[4781]: I0227 01:03:07.915301 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kdj4m"] Feb 27 01:03:08 crc kubenswrapper[4781]: I0227 01:03:08.325974 4781 generic.go:334] "Generic (PLEG): container finished" podID="265ff10b-4377-40a2-af31-17901aa730b7" containerID="786d5655f773cf7a8adee6ac265fc020dec743fcc35d87b23bd0565d18c711ae" exitCode=0 Feb 27 01:03:08 crc kubenswrapper[4781]: I0227 01:03:08.326027 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdj4m" event={"ID":"265ff10b-4377-40a2-af31-17901aa730b7","Type":"ContainerDied","Data":"786d5655f773cf7a8adee6ac265fc020dec743fcc35d87b23bd0565d18c711ae"} Feb 27 01:03:08 crc kubenswrapper[4781]: I0227 01:03:08.326064 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdj4m" event={"ID":"265ff10b-4377-40a2-af31-17901aa730b7","Type":"ContainerStarted","Data":"2198efaa880f708ce961cdfd28d8b51bfa029ee0f97064cd5f306fafb07ec0f2"} Feb 27 01:03:09 crc kubenswrapper[4781]: I0227 01:03:09.338743 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-kdj4m" event={"ID":"265ff10b-4377-40a2-af31-17901aa730b7","Type":"ContainerStarted","Data":"bd5f95d35a7249c23685e6b8da1f3e1de7a3276d4acf01e3d6ad32b53e327bcd"} Feb 27 01:03:11 crc kubenswrapper[4781]: I0227 01:03:11.363480 4781 generic.go:334] "Generic (PLEG): container finished" podID="265ff10b-4377-40a2-af31-17901aa730b7" containerID="bd5f95d35a7249c23685e6b8da1f3e1de7a3276d4acf01e3d6ad32b53e327bcd" exitCode=0 Feb 27 01:03:11 crc kubenswrapper[4781]: I0227 01:03:11.363575 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdj4m" event={"ID":"265ff10b-4377-40a2-af31-17901aa730b7","Type":"ContainerDied","Data":"bd5f95d35a7249c23685e6b8da1f3e1de7a3276d4acf01e3d6ad32b53e327bcd"} Feb 27 01:03:13 crc kubenswrapper[4781]: I0227 01:03:13.390488 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdj4m" event={"ID":"265ff10b-4377-40a2-af31-17901aa730b7","Type":"ContainerStarted","Data":"1a7b53235f869955b24dbc099aa23073ae74427d63b594dbcfa42cc7dd61f71e"} Feb 27 01:03:13 crc kubenswrapper[4781]: I0227 01:03:13.416468 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kdj4m" podStartSLOduration=3.559889456 podStartE2EDuration="7.416370924s" podCreationTimestamp="2026-02-27 01:03:06 +0000 UTC" firstStartedPulling="2026-02-27 01:03:08.328574633 +0000 UTC m=+3457.586114187" lastFinishedPulling="2026-02-27 01:03:12.185056101 +0000 UTC m=+3461.442595655" observedRunningTime="2026-02-27 01:03:13.407830118 +0000 UTC m=+3462.665369702" watchObservedRunningTime="2026-02-27 01:03:13.416370924 +0000 UTC m=+3462.673910488" Feb 27 01:03:17 crc kubenswrapper[4781]: I0227 01:03:17.325693 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kdj4m" Feb 27 01:03:17 crc kubenswrapper[4781]: I0227 01:03:17.326571 4781 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kdj4m" Feb 27 01:03:17 crc kubenswrapper[4781]: I0227 01:03:17.375989 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kdj4m" Feb 27 01:03:17 crc kubenswrapper[4781]: I0227 01:03:17.481686 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kdj4m" Feb 27 01:03:17 crc kubenswrapper[4781]: I0227 01:03:17.617961 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kdj4m"] Feb 27 01:03:19 crc kubenswrapper[4781]: I0227 01:03:19.446936 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kdj4m" podUID="265ff10b-4377-40a2-af31-17901aa730b7" containerName="registry-server" containerID="cri-o://1a7b53235f869955b24dbc099aa23073ae74427d63b594dbcfa42cc7dd61f71e" gracePeriod=2 Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.026997 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kdj4m" Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.097703 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/265ff10b-4377-40a2-af31-17901aa730b7-catalog-content\") pod \"265ff10b-4377-40a2-af31-17901aa730b7\" (UID: \"265ff10b-4377-40a2-af31-17901aa730b7\") " Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.097777 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8q49\" (UniqueName: \"kubernetes.io/projected/265ff10b-4377-40a2-af31-17901aa730b7-kube-api-access-t8q49\") pod \"265ff10b-4377-40a2-af31-17901aa730b7\" (UID: \"265ff10b-4377-40a2-af31-17901aa730b7\") " Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.098044 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/265ff10b-4377-40a2-af31-17901aa730b7-utilities\") pod \"265ff10b-4377-40a2-af31-17901aa730b7\" (UID: \"265ff10b-4377-40a2-af31-17901aa730b7\") " Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.099352 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/265ff10b-4377-40a2-af31-17901aa730b7-utilities" (OuterVolumeSpecName: "utilities") pod "265ff10b-4377-40a2-af31-17901aa730b7" (UID: "265ff10b-4377-40a2-af31-17901aa730b7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.115038 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/265ff10b-4377-40a2-af31-17901aa730b7-kube-api-access-t8q49" (OuterVolumeSpecName: "kube-api-access-t8q49") pod "265ff10b-4377-40a2-af31-17901aa730b7" (UID: "265ff10b-4377-40a2-af31-17901aa730b7"). InnerVolumeSpecName "kube-api-access-t8q49". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.205443 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/265ff10b-4377-40a2-af31-17901aa730b7-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.205528 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8q49\" (UniqueName: \"kubernetes.io/projected/265ff10b-4377-40a2-af31-17901aa730b7-kube-api-access-t8q49\") on node \"crc\" DevicePath \"\"" Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.283748 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/265ff10b-4377-40a2-af31-17901aa730b7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "265ff10b-4377-40a2-af31-17901aa730b7" (UID: "265ff10b-4377-40a2-af31-17901aa730b7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.307868 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/265ff10b-4377-40a2-af31-17901aa730b7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.459989 4781 generic.go:334] "Generic (PLEG): container finished" podID="265ff10b-4377-40a2-af31-17901aa730b7" containerID="1a7b53235f869955b24dbc099aa23073ae74427d63b594dbcfa42cc7dd61f71e" exitCode=0 Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.460063 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdj4m" event={"ID":"265ff10b-4377-40a2-af31-17901aa730b7","Type":"ContainerDied","Data":"1a7b53235f869955b24dbc099aa23073ae74427d63b594dbcfa42cc7dd61f71e"} Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.460121 4781 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-kdj4m" event={"ID":"265ff10b-4377-40a2-af31-17901aa730b7","Type":"ContainerDied","Data":"2198efaa880f708ce961cdfd28d8b51bfa029ee0f97064cd5f306fafb07ec0f2"} Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.460173 4781 scope.go:117] "RemoveContainer" containerID="1a7b53235f869955b24dbc099aa23073ae74427d63b594dbcfa42cc7dd61f71e" Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.460198 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kdj4m" Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.482862 4781 scope.go:117] "RemoveContainer" containerID="bd5f95d35a7249c23685e6b8da1f3e1de7a3276d4acf01e3d6ad32b53e327bcd" Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.513072 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kdj4m"] Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.515359 4781 scope.go:117] "RemoveContainer" containerID="786d5655f773cf7a8adee6ac265fc020dec743fcc35d87b23bd0565d18c711ae" Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.526990 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kdj4m"] Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.567138 4781 scope.go:117] "RemoveContainer" containerID="1a7b53235f869955b24dbc099aa23073ae74427d63b594dbcfa42cc7dd61f71e" Feb 27 01:03:20 crc kubenswrapper[4781]: E0227 01:03:20.567811 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a7b53235f869955b24dbc099aa23073ae74427d63b594dbcfa42cc7dd61f71e\": container with ID starting with 1a7b53235f869955b24dbc099aa23073ae74427d63b594dbcfa42cc7dd61f71e not found: ID does not exist" containerID="1a7b53235f869955b24dbc099aa23073ae74427d63b594dbcfa42cc7dd61f71e" Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 
01:03:20.567904 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a7b53235f869955b24dbc099aa23073ae74427d63b594dbcfa42cc7dd61f71e"} err="failed to get container status \"1a7b53235f869955b24dbc099aa23073ae74427d63b594dbcfa42cc7dd61f71e\": rpc error: code = NotFound desc = could not find container \"1a7b53235f869955b24dbc099aa23073ae74427d63b594dbcfa42cc7dd61f71e\": container with ID starting with 1a7b53235f869955b24dbc099aa23073ae74427d63b594dbcfa42cc7dd61f71e not found: ID does not exist" Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.567941 4781 scope.go:117] "RemoveContainer" containerID="bd5f95d35a7249c23685e6b8da1f3e1de7a3276d4acf01e3d6ad32b53e327bcd" Feb 27 01:03:20 crc kubenswrapper[4781]: E0227 01:03:20.568869 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd5f95d35a7249c23685e6b8da1f3e1de7a3276d4acf01e3d6ad32b53e327bcd\": container with ID starting with bd5f95d35a7249c23685e6b8da1f3e1de7a3276d4acf01e3d6ad32b53e327bcd not found: ID does not exist" containerID="bd5f95d35a7249c23685e6b8da1f3e1de7a3276d4acf01e3d6ad32b53e327bcd" Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.568915 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd5f95d35a7249c23685e6b8da1f3e1de7a3276d4acf01e3d6ad32b53e327bcd"} err="failed to get container status \"bd5f95d35a7249c23685e6b8da1f3e1de7a3276d4acf01e3d6ad32b53e327bcd\": rpc error: code = NotFound desc = could not find container \"bd5f95d35a7249c23685e6b8da1f3e1de7a3276d4acf01e3d6ad32b53e327bcd\": container with ID starting with bd5f95d35a7249c23685e6b8da1f3e1de7a3276d4acf01e3d6ad32b53e327bcd not found: ID does not exist" Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.568947 4781 scope.go:117] "RemoveContainer" containerID="786d5655f773cf7a8adee6ac265fc020dec743fcc35d87b23bd0565d18c711ae" Feb 27 01:03:20 crc 
kubenswrapper[4781]: E0227 01:03:20.569297 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"786d5655f773cf7a8adee6ac265fc020dec743fcc35d87b23bd0565d18c711ae\": container with ID starting with 786d5655f773cf7a8adee6ac265fc020dec743fcc35d87b23bd0565d18c711ae not found: ID does not exist" containerID="786d5655f773cf7a8adee6ac265fc020dec743fcc35d87b23bd0565d18c711ae" Feb 27 01:03:20 crc kubenswrapper[4781]: I0227 01:03:20.569372 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"786d5655f773cf7a8adee6ac265fc020dec743fcc35d87b23bd0565d18c711ae"} err="failed to get container status \"786d5655f773cf7a8adee6ac265fc020dec743fcc35d87b23bd0565d18c711ae\": rpc error: code = NotFound desc = could not find container \"786d5655f773cf7a8adee6ac265fc020dec743fcc35d87b23bd0565d18c711ae\": container with ID starting with 786d5655f773cf7a8adee6ac265fc020dec743fcc35d87b23bd0565d18c711ae not found: ID does not exist" Feb 27 01:03:21 crc kubenswrapper[4781]: I0227 01:03:21.321964 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="265ff10b-4377-40a2-af31-17901aa730b7" path="/var/lib/kubelet/pods/265ff10b-4377-40a2-af31-17901aa730b7/volumes" Feb 27 01:03:22 crc kubenswrapper[4781]: I0227 01:03:22.309783 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 01:03:22 crc kubenswrapper[4781]: E0227 01:03:22.310309 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:03:37 crc 
kubenswrapper[4781]: I0227 01:03:37.310328 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 01:03:37 crc kubenswrapper[4781]: E0227 01:03:37.311285 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:03:48 crc kubenswrapper[4781]: I0227 01:03:48.309753 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 01:03:48 crc kubenswrapper[4781]: I0227 01:03:48.729970 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"19e186a1d63090ffda2bb27999feb897d50891041c0f8dac4c6ddf6ef96ddf91"} Feb 27 01:04:00 crc kubenswrapper[4781]: I0227 01:04:00.152816 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535904-lpvnq"] Feb 27 01:04:00 crc kubenswrapper[4781]: E0227 01:04:00.153779 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="265ff10b-4377-40a2-af31-17901aa730b7" containerName="extract-content" Feb 27 01:04:00 crc kubenswrapper[4781]: I0227 01:04:00.153798 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="265ff10b-4377-40a2-af31-17901aa730b7" containerName="extract-content" Feb 27 01:04:00 crc kubenswrapper[4781]: E0227 01:04:00.153819 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="265ff10b-4377-40a2-af31-17901aa730b7" containerName="registry-server" Feb 27 01:04:00 crc kubenswrapper[4781]: I0227 
01:04:00.153827 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="265ff10b-4377-40a2-af31-17901aa730b7" containerName="registry-server" Feb 27 01:04:00 crc kubenswrapper[4781]: E0227 01:04:00.153867 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="265ff10b-4377-40a2-af31-17901aa730b7" containerName="extract-utilities" Feb 27 01:04:00 crc kubenswrapper[4781]: I0227 01:04:00.153876 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="265ff10b-4377-40a2-af31-17901aa730b7" containerName="extract-utilities" Feb 27 01:04:00 crc kubenswrapper[4781]: I0227 01:04:00.154176 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="265ff10b-4377-40a2-af31-17901aa730b7" containerName="registry-server" Feb 27 01:04:00 crc kubenswrapper[4781]: I0227 01:04:00.155353 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535904-lpvnq" Feb 27 01:04:00 crc kubenswrapper[4781]: I0227 01:04:00.157547 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:04:00 crc kubenswrapper[4781]: I0227 01:04:00.158198 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 01:04:00 crc kubenswrapper[4781]: I0227 01:04:00.158967 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:04:00 crc kubenswrapper[4781]: I0227 01:04:00.165750 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535904-lpvnq"] Feb 27 01:04:00 crc kubenswrapper[4781]: I0227 01:04:00.265797 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq4rg\" (UniqueName: \"kubernetes.io/projected/f8015524-a32f-427b-a5a9-08f1d2257259-kube-api-access-pq4rg\") pod \"auto-csr-approver-29535904-lpvnq\" (UID: 
\"f8015524-a32f-427b-a5a9-08f1d2257259\") " pod="openshift-infra/auto-csr-approver-29535904-lpvnq" Feb 27 01:04:00 crc kubenswrapper[4781]: I0227 01:04:00.368317 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq4rg\" (UniqueName: \"kubernetes.io/projected/f8015524-a32f-427b-a5a9-08f1d2257259-kube-api-access-pq4rg\") pod \"auto-csr-approver-29535904-lpvnq\" (UID: \"f8015524-a32f-427b-a5a9-08f1d2257259\") " pod="openshift-infra/auto-csr-approver-29535904-lpvnq" Feb 27 01:04:00 crc kubenswrapper[4781]: I0227 01:04:00.391975 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq4rg\" (UniqueName: \"kubernetes.io/projected/f8015524-a32f-427b-a5a9-08f1d2257259-kube-api-access-pq4rg\") pod \"auto-csr-approver-29535904-lpvnq\" (UID: \"f8015524-a32f-427b-a5a9-08f1d2257259\") " pod="openshift-infra/auto-csr-approver-29535904-lpvnq" Feb 27 01:04:00 crc kubenswrapper[4781]: I0227 01:04:00.480206 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535904-lpvnq" Feb 27 01:04:00 crc kubenswrapper[4781]: I0227 01:04:00.986532 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535904-lpvnq"] Feb 27 01:04:01 crc kubenswrapper[4781]: I0227 01:04:01.860612 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535904-lpvnq" event={"ID":"f8015524-a32f-427b-a5a9-08f1d2257259","Type":"ContainerStarted","Data":"bbfa9c982760cf326a62e9c25967f50c4731c438113a0c7dd6b72a5030919178"} Feb 27 01:04:02 crc kubenswrapper[4781]: I0227 01:04:02.871667 4781 generic.go:334] "Generic (PLEG): container finished" podID="f8015524-a32f-427b-a5a9-08f1d2257259" containerID="7727ecd4b6ab2c57f71f74adfa530ee79124f2b2f80dab2ef9d287684b1949a8" exitCode=0 Feb 27 01:04:02 crc kubenswrapper[4781]: I0227 01:04:02.871746 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535904-lpvnq" event={"ID":"f8015524-a32f-427b-a5a9-08f1d2257259","Type":"ContainerDied","Data":"7727ecd4b6ab2c57f71f74adfa530ee79124f2b2f80dab2ef9d287684b1949a8"} Feb 27 01:04:04 crc kubenswrapper[4781]: I0227 01:04:04.306667 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535904-lpvnq" Feb 27 01:04:04 crc kubenswrapper[4781]: I0227 01:04:04.349477 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq4rg\" (UniqueName: \"kubernetes.io/projected/f8015524-a32f-427b-a5a9-08f1d2257259-kube-api-access-pq4rg\") pod \"f8015524-a32f-427b-a5a9-08f1d2257259\" (UID: \"f8015524-a32f-427b-a5a9-08f1d2257259\") " Feb 27 01:04:04 crc kubenswrapper[4781]: I0227 01:04:04.361826 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8015524-a32f-427b-a5a9-08f1d2257259-kube-api-access-pq4rg" (OuterVolumeSpecName: "kube-api-access-pq4rg") pod "f8015524-a32f-427b-a5a9-08f1d2257259" (UID: "f8015524-a32f-427b-a5a9-08f1d2257259"). InnerVolumeSpecName "kube-api-access-pq4rg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:04:04 crc kubenswrapper[4781]: I0227 01:04:04.452268 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pq4rg\" (UniqueName: \"kubernetes.io/projected/f8015524-a32f-427b-a5a9-08f1d2257259-kube-api-access-pq4rg\") on node \"crc\" DevicePath \"\"" Feb 27 01:04:04 crc kubenswrapper[4781]: I0227 01:04:04.892031 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535904-lpvnq" event={"ID":"f8015524-a32f-427b-a5a9-08f1d2257259","Type":"ContainerDied","Data":"bbfa9c982760cf326a62e9c25967f50c4731c438113a0c7dd6b72a5030919178"} Feb 27 01:04:04 crc kubenswrapper[4781]: I0227 01:04:04.892085 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbfa9c982760cf326a62e9c25967f50c4731c438113a0c7dd6b72a5030919178" Feb 27 01:04:04 crc kubenswrapper[4781]: I0227 01:04:04.892150 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535904-lpvnq" Feb 27 01:04:05 crc kubenswrapper[4781]: I0227 01:04:05.376746 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535898-vpdkx"] Feb 27 01:04:05 crc kubenswrapper[4781]: I0227 01:04:05.387861 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535898-vpdkx"] Feb 27 01:04:07 crc kubenswrapper[4781]: I0227 01:04:07.321482 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b518ad5e-0994-4767-9c6d-d2ca11998a43" path="/var/lib/kubelet/pods/b518ad5e-0994-4767-9c6d-d2ca11998a43/volumes" Feb 27 01:04:32 crc kubenswrapper[4781]: I0227 01:04:32.567060 4781 scope.go:117] "RemoveContainer" containerID="5f9790a75567a30dcdf46b8e6f6e9baff3953d885f3c6f58834afe7ab39768fd" Feb 27 01:06:00 crc kubenswrapper[4781]: I0227 01:06:00.149262 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535906-d594j"] Feb 27 01:06:00 crc kubenswrapper[4781]: E0227 01:06:00.150304 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8015524-a32f-427b-a5a9-08f1d2257259" containerName="oc" Feb 27 01:06:00 crc kubenswrapper[4781]: I0227 01:06:00.150320 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8015524-a32f-427b-a5a9-08f1d2257259" containerName="oc" Feb 27 01:06:00 crc kubenswrapper[4781]: I0227 01:06:00.150567 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8015524-a32f-427b-a5a9-08f1d2257259" containerName="oc" Feb 27 01:06:00 crc kubenswrapper[4781]: I0227 01:06:00.151355 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535906-d594j" Feb 27 01:06:00 crc kubenswrapper[4781]: I0227 01:06:00.156174 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 01:06:00 crc kubenswrapper[4781]: I0227 01:06:00.156219 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:06:00 crc kubenswrapper[4781]: I0227 01:06:00.160912 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:06:00 crc kubenswrapper[4781]: I0227 01:06:00.164495 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535906-d594j"] Feb 27 01:06:00 crc kubenswrapper[4781]: I0227 01:06:00.283822 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl5jt\" (UniqueName: \"kubernetes.io/projected/dab64a02-9142-4b6f-95c2-1e3805ef62fc-kube-api-access-cl5jt\") pod \"auto-csr-approver-29535906-d594j\" (UID: \"dab64a02-9142-4b6f-95c2-1e3805ef62fc\") " pod="openshift-infra/auto-csr-approver-29535906-d594j" Feb 27 01:06:00 crc kubenswrapper[4781]: I0227 01:06:00.386301 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl5jt\" (UniqueName: \"kubernetes.io/projected/dab64a02-9142-4b6f-95c2-1e3805ef62fc-kube-api-access-cl5jt\") pod \"auto-csr-approver-29535906-d594j\" (UID: \"dab64a02-9142-4b6f-95c2-1e3805ef62fc\") " pod="openshift-infra/auto-csr-approver-29535906-d594j" Feb 27 01:06:00 crc kubenswrapper[4781]: I0227 01:06:00.405680 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl5jt\" (UniqueName: \"kubernetes.io/projected/dab64a02-9142-4b6f-95c2-1e3805ef62fc-kube-api-access-cl5jt\") pod \"auto-csr-approver-29535906-d594j\" (UID: \"dab64a02-9142-4b6f-95c2-1e3805ef62fc\") " 
pod="openshift-infra/auto-csr-approver-29535906-d594j" Feb 27 01:06:00 crc kubenswrapper[4781]: I0227 01:06:00.471083 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535906-d594j" Feb 27 01:06:00 crc kubenswrapper[4781]: I0227 01:06:00.943031 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535906-d594j"] Feb 27 01:06:00 crc kubenswrapper[4781]: I0227 01:06:00.947843 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 01:06:01 crc kubenswrapper[4781]: I0227 01:06:01.223739 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535906-d594j" event={"ID":"dab64a02-9142-4b6f-95c2-1e3805ef62fc","Type":"ContainerStarted","Data":"b674b21584f06bb732d8c99e018808b988bcda4f3eab8d21a9b2f326fbb0e016"} Feb 27 01:06:02 crc kubenswrapper[4781]: I0227 01:06:02.236085 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535906-d594j" event={"ID":"dab64a02-9142-4b6f-95c2-1e3805ef62fc","Type":"ContainerStarted","Data":"b05e61a8466110a32ab8e96fdf9a1fec0c346bbcf2b136cb6d58c69fbbfe2a41"} Feb 27 01:06:02 crc kubenswrapper[4781]: I0227 01:06:02.255821 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535906-d594j" podStartSLOduration=1.314092769 podStartE2EDuration="2.255802838s" podCreationTimestamp="2026-02-27 01:06:00 +0000 UTC" firstStartedPulling="2026-02-27 01:06:00.947540685 +0000 UTC m=+3630.205080239" lastFinishedPulling="2026-02-27 01:06:01.889250754 +0000 UTC m=+3631.146790308" observedRunningTime="2026-02-27 01:06:02.248088594 +0000 UTC m=+3631.505628148" watchObservedRunningTime="2026-02-27 01:06:02.255802838 +0000 UTC m=+3631.513342392" Feb 27 01:06:03 crc kubenswrapper[4781]: I0227 01:06:03.249592 4781 generic.go:334] "Generic (PLEG): container finished" 
podID="dab64a02-9142-4b6f-95c2-1e3805ef62fc" containerID="b05e61a8466110a32ab8e96fdf9a1fec0c346bbcf2b136cb6d58c69fbbfe2a41" exitCode=0 Feb 27 01:06:03 crc kubenswrapper[4781]: I0227 01:06:03.249667 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535906-d594j" event={"ID":"dab64a02-9142-4b6f-95c2-1e3805ef62fc","Type":"ContainerDied","Data":"b05e61a8466110a32ab8e96fdf9a1fec0c346bbcf2b136cb6d58c69fbbfe2a41"} Feb 27 01:06:04 crc kubenswrapper[4781]: I0227 01:06:04.682798 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535906-d594j" Feb 27 01:06:04 crc kubenswrapper[4781]: I0227 01:06:04.773469 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cl5jt\" (UniqueName: \"kubernetes.io/projected/dab64a02-9142-4b6f-95c2-1e3805ef62fc-kube-api-access-cl5jt\") pod \"dab64a02-9142-4b6f-95c2-1e3805ef62fc\" (UID: \"dab64a02-9142-4b6f-95c2-1e3805ef62fc\") " Feb 27 01:06:04 crc kubenswrapper[4781]: I0227 01:06:04.780899 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dab64a02-9142-4b6f-95c2-1e3805ef62fc-kube-api-access-cl5jt" (OuterVolumeSpecName: "kube-api-access-cl5jt") pod "dab64a02-9142-4b6f-95c2-1e3805ef62fc" (UID: "dab64a02-9142-4b6f-95c2-1e3805ef62fc"). InnerVolumeSpecName "kube-api-access-cl5jt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:04 crc kubenswrapper[4781]: I0227 01:06:04.876237 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cl5jt\" (UniqueName: \"kubernetes.io/projected/dab64a02-9142-4b6f-95c2-1e3805ef62fc-kube-api-access-cl5jt\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:05 crc kubenswrapper[4781]: I0227 01:06:05.269394 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535906-d594j" event={"ID":"dab64a02-9142-4b6f-95c2-1e3805ef62fc","Type":"ContainerDied","Data":"b674b21584f06bb732d8c99e018808b988bcda4f3eab8d21a9b2f326fbb0e016"} Feb 27 01:06:05 crc kubenswrapper[4781]: I0227 01:06:05.269680 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b674b21584f06bb732d8c99e018808b988bcda4f3eab8d21a9b2f326fbb0e016" Feb 27 01:06:05 crc kubenswrapper[4781]: I0227 01:06:05.269688 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535906-d594j" Feb 27 01:06:05 crc kubenswrapper[4781]: I0227 01:06:05.747363 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535900-6n6zr"] Feb 27 01:06:05 crc kubenswrapper[4781]: I0227 01:06:05.756784 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535900-6n6zr"] Feb 27 01:06:07 crc kubenswrapper[4781]: I0227 01:06:07.322106 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0aac78d6-5f5c-4b48-95f2-554353abcdd3" path="/var/lib/kubelet/pods/0aac78d6-5f5c-4b48-95f2-554353abcdd3/volumes" Feb 27 01:06:12 crc kubenswrapper[4781]: I0227 01:06:12.895793 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 27 01:06:12 crc kubenswrapper[4781]: I0227 01:06:12.896348 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:06:32 crc kubenswrapper[4781]: I0227 01:06:32.664931 4781 scope.go:117] "RemoveContainer" containerID="fb76bcf8730e0171831c959b0a00779c7b469f5264f4c1f6152625c6f8db5a04" Feb 27 01:06:42 crc kubenswrapper[4781]: I0227 01:06:42.896001 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:06:42 crc kubenswrapper[4781]: I0227 01:06:42.896549 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:07:12 crc kubenswrapper[4781]: I0227 01:07:12.896057 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:07:12 crc kubenswrapper[4781]: I0227 01:07:12.896750 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:07:12 crc kubenswrapper[4781]: I0227 01:07:12.896810 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 01:07:12 crc kubenswrapper[4781]: I0227 01:07:12.897713 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"19e186a1d63090ffda2bb27999feb897d50891041c0f8dac4c6ddf6ef96ddf91"} pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 01:07:12 crc kubenswrapper[4781]: I0227 01:07:12.897775 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" containerID="cri-o://19e186a1d63090ffda2bb27999feb897d50891041c0f8dac4c6ddf6ef96ddf91" gracePeriod=600 Feb 27 01:07:13 crc kubenswrapper[4781]: I0227 01:07:13.975278 4781 generic.go:334] "Generic (PLEG): container finished" podID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerID="19e186a1d63090ffda2bb27999feb897d50891041c0f8dac4c6ddf6ef96ddf91" exitCode=0 Feb 27 01:07:13 crc kubenswrapper[4781]: I0227 01:07:13.975368 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerDied","Data":"19e186a1d63090ffda2bb27999feb897d50891041c0f8dac4c6ddf6ef96ddf91"} Feb 27 01:07:13 crc kubenswrapper[4781]: I0227 01:07:13.975906 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" 
event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5"} Feb 27 01:07:13 crc kubenswrapper[4781]: I0227 01:07:13.975936 4781 scope.go:117] "RemoveContainer" containerID="af35a99be11f3c14d545a455700a19c94a345e55043527241027ffde9cbdb5a3" Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.588682 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Feb 27 01:07:46 crc kubenswrapper[4781]: E0227 01:07:46.590498 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dab64a02-9142-4b6f-95c2-1e3805ef62fc" containerName="oc" Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.590522 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab64a02-9142-4b6f-95c2-1e3805ef62fc" containerName="oc" Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.590847 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="dab64a02-9142-4b6f-95c2-1e3805ef62fc" containerName="oc" Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.592117 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.595282 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-5s299" Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.595444 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.595451 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.600839 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.605551 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.628958 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2cc23bf5-7773-4d33-b2be-2ee2a807f086-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest" Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.629021 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cc23bf5-7773-4d33-b2be-2ee2a807f086-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest" Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.629050 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/2cc23bf5-7773-4d33-b2be-2ee2a807f086-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest" Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.629085 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2cc23bf5-7773-4d33-b2be-2ee2a807f086-config-data\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest" Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.629110 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/2cc23bf5-7773-4d33-b2be-2ee2a807f086-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest" Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.629447 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/2cc23bf5-7773-4d33-b2be-2ee2a807f086-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest" Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.629587 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj679\" (UniqueName: \"kubernetes.io/projected/2cc23bf5-7773-4d33-b2be-2ee2a807f086-kube-api-access-dj679\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest" Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.629619 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2cc23bf5-7773-4d33-b2be-2ee2a807f086-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest" Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.629890 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest" Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.731532 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj679\" (UniqueName: \"kubernetes.io/projected/2cc23bf5-7773-4d33-b2be-2ee2a807f086-kube-api-access-dj679\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest" Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.731930 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2cc23bf5-7773-4d33-b2be-2ee2a807f086-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest" Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.732148 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest" Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.732337 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/2cc23bf5-7773-4d33-b2be-2ee2a807f086-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest" Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.732450 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cc23bf5-7773-4d33-b2be-2ee2a807f086-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest" Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.732552 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/2cc23bf5-7773-4d33-b2be-2ee2a807f086-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest" Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.732674 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2cc23bf5-7773-4d33-b2be-2ee2a807f086-config-data\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest" Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.732782 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/2cc23bf5-7773-4d33-b2be-2ee2a807f086-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest" Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.732940 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/2cc23bf5-7773-4d33-b2be-2ee2a807f086-test-operator-ephemeral-temporary\") pod 
\"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest" Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.732492 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/tempest-tests-tempest" Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.733033 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/2cc23bf5-7773-4d33-b2be-2ee2a807f086-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest" Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.733615 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2cc23bf5-7773-4d33-b2be-2ee2a807f086-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest" Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.733780 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/2cc23bf5-7773-4d33-b2be-2ee2a807f086-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest" Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.734178 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2cc23bf5-7773-4d33-b2be-2ee2a807f086-config-data\") pod \"tempest-tests-tempest\" (UID: 
\"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest" Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.739235 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cc23bf5-7773-4d33-b2be-2ee2a807f086-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest" Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.739734 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/2cc23bf5-7773-4d33-b2be-2ee2a807f086-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest" Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.741142 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2cc23bf5-7773-4d33-b2be-2ee2a807f086-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest" Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.751979 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj679\" (UniqueName: \"kubernetes.io/projected/2cc23bf5-7773-4d33-b2be-2ee2a807f086-kube-api-access-dj679\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest" Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.762862 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " pod="openstack/tempest-tests-tempest" Feb 27 01:07:46 crc kubenswrapper[4781]: I0227 01:07:46.925350 4781 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 27 01:07:47 crc kubenswrapper[4781]: I0227 01:07:47.390794 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 27 01:07:48 crc kubenswrapper[4781]: I0227 01:07:48.306816 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"2cc23bf5-7773-4d33-b2be-2ee2a807f086","Type":"ContainerStarted","Data":"09459f242ec2925373f69aa651b16dcc96301f46d456e4eb0a8a401a4473bde9"} Feb 27 01:08:00 crc kubenswrapper[4781]: I0227 01:08:00.161741 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535908-lzshf"] Feb 27 01:08:00 crc kubenswrapper[4781]: I0227 01:08:00.164096 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535908-lzshf" Feb 27 01:08:00 crc kubenswrapper[4781]: I0227 01:08:00.167413 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:08:00 crc kubenswrapper[4781]: I0227 01:08:00.167740 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 01:08:00 crc kubenswrapper[4781]: I0227 01:08:00.167877 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:08:00 crc kubenswrapper[4781]: I0227 01:08:00.177260 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535908-lzshf"] Feb 27 01:08:00 crc kubenswrapper[4781]: I0227 01:08:00.333505 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx5f5\" (UniqueName: \"kubernetes.io/projected/f6cd4500-04f9-471d-8c27-2ce1b03fa4f0-kube-api-access-gx5f5\") pod \"auto-csr-approver-29535908-lzshf\" (UID: \"f6cd4500-04f9-471d-8c27-2ce1b03fa4f0\") " 
pod="openshift-infra/auto-csr-approver-29535908-lzshf" Feb 27 01:08:00 crc kubenswrapper[4781]: I0227 01:08:00.436928 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx5f5\" (UniqueName: \"kubernetes.io/projected/f6cd4500-04f9-471d-8c27-2ce1b03fa4f0-kube-api-access-gx5f5\") pod \"auto-csr-approver-29535908-lzshf\" (UID: \"f6cd4500-04f9-471d-8c27-2ce1b03fa4f0\") " pod="openshift-infra/auto-csr-approver-29535908-lzshf" Feb 27 01:08:00 crc kubenswrapper[4781]: I0227 01:08:00.459257 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx5f5\" (UniqueName: \"kubernetes.io/projected/f6cd4500-04f9-471d-8c27-2ce1b03fa4f0-kube-api-access-gx5f5\") pod \"auto-csr-approver-29535908-lzshf\" (UID: \"f6cd4500-04f9-471d-8c27-2ce1b03fa4f0\") " pod="openshift-infra/auto-csr-approver-29535908-lzshf" Feb 27 01:08:00 crc kubenswrapper[4781]: I0227 01:08:00.493759 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535908-lzshf" Feb 27 01:08:17 crc kubenswrapper[4781]: E0227 01:08:17.112000 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Feb 27 01:08:17 crc kubenswrapper[4781]: E0227 01:08:17.113691 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dj679,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(2cc23bf5-7773-4d33-b2be-2ee2a807f086): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 01:08:17 crc kubenswrapper[4781]: E0227 01:08:17.115772 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="2cc23bf5-7773-4d33-b2be-2ee2a807f086" Feb 27 01:08:17 crc kubenswrapper[4781]: E0227 01:08:17.647925 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="2cc23bf5-7773-4d33-b2be-2ee2a807f086" Feb 27 01:08:17 crc 
kubenswrapper[4781]: I0227 01:08:17.655034 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535908-lzshf"] Feb 27 01:08:18 crc kubenswrapper[4781]: I0227 01:08:18.656947 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535908-lzshf" event={"ID":"f6cd4500-04f9-471d-8c27-2ce1b03fa4f0","Type":"ContainerStarted","Data":"10a940d58a4c252f52d537dcb72a4d88359f6729136d8f994c4322e0b99dc05f"} Feb 27 01:08:19 crc kubenswrapper[4781]: I0227 01:08:19.666810 4781 generic.go:334] "Generic (PLEG): container finished" podID="f6cd4500-04f9-471d-8c27-2ce1b03fa4f0" containerID="d86455502c8fe2209abff00ea2ac33cb262fd9455f065e8361be2d1baaf2ea79" exitCode=0 Feb 27 01:08:19 crc kubenswrapper[4781]: I0227 01:08:19.666918 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535908-lzshf" event={"ID":"f6cd4500-04f9-471d-8c27-2ce1b03fa4f0","Type":"ContainerDied","Data":"d86455502c8fe2209abff00ea2ac33cb262fd9455f065e8361be2d1baaf2ea79"} Feb 27 01:08:21 crc kubenswrapper[4781]: I0227 01:08:21.083470 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535908-lzshf" Feb 27 01:08:21 crc kubenswrapper[4781]: I0227 01:08:21.198549 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gx5f5\" (UniqueName: \"kubernetes.io/projected/f6cd4500-04f9-471d-8c27-2ce1b03fa4f0-kube-api-access-gx5f5\") pod \"f6cd4500-04f9-471d-8c27-2ce1b03fa4f0\" (UID: \"f6cd4500-04f9-471d-8c27-2ce1b03fa4f0\") " Feb 27 01:08:21 crc kubenswrapper[4781]: I0227 01:08:21.205600 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6cd4500-04f9-471d-8c27-2ce1b03fa4f0-kube-api-access-gx5f5" (OuterVolumeSpecName: "kube-api-access-gx5f5") pod "f6cd4500-04f9-471d-8c27-2ce1b03fa4f0" (UID: "f6cd4500-04f9-471d-8c27-2ce1b03fa4f0"). 
InnerVolumeSpecName "kube-api-access-gx5f5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:08:21 crc kubenswrapper[4781]: I0227 01:08:21.302293 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gx5f5\" (UniqueName: \"kubernetes.io/projected/f6cd4500-04f9-471d-8c27-2ce1b03fa4f0-kube-api-access-gx5f5\") on node \"crc\" DevicePath \"\"" Feb 27 01:08:21 crc kubenswrapper[4781]: I0227 01:08:21.688478 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535908-lzshf" event={"ID":"f6cd4500-04f9-471d-8c27-2ce1b03fa4f0","Type":"ContainerDied","Data":"10a940d58a4c252f52d537dcb72a4d88359f6729136d8f994c4322e0b99dc05f"} Feb 27 01:08:21 crc kubenswrapper[4781]: I0227 01:08:21.688518 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10a940d58a4c252f52d537dcb72a4d88359f6729136d8f994c4322e0b99dc05f" Feb 27 01:08:21 crc kubenswrapper[4781]: I0227 01:08:21.688565 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535908-lzshf" Feb 27 01:08:22 crc kubenswrapper[4781]: I0227 01:08:22.156995 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535902-bkgwj"] Feb 27 01:08:22 crc kubenswrapper[4781]: I0227 01:08:22.166580 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535902-bkgwj"] Feb 27 01:08:23 crc kubenswrapper[4781]: I0227 01:08:23.323717 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da201e7f-72da-4998-8ecb-98a8814f423d" path="/var/lib/kubelet/pods/da201e7f-72da-4998-8ecb-98a8814f423d/volumes" Feb 27 01:08:31 crc kubenswrapper[4781]: I0227 01:08:31.745381 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 27 01:08:32 crc kubenswrapper[4781]: I0227 01:08:32.770588 4781 scope.go:117] "RemoveContainer" containerID="cba817e11e179b47fa5e55d89f7bb6242121790488edf6a29e663a57c82230bd" Feb 27 01:08:32 crc kubenswrapper[4781]: I0227 01:08:32.813587 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"2cc23bf5-7773-4d33-b2be-2ee2a807f086","Type":"ContainerStarted","Data":"4ec0cfbe0f662afc3fb53d5da9b369a462851688dbd0c754fa273d9d0f52d0e5"} Feb 27 01:08:32 crc kubenswrapper[4781]: I0227 01:08:32.841126 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.495126001 podStartE2EDuration="47.841101987s" podCreationTimestamp="2026-02-27 01:07:45 +0000 UTC" firstStartedPulling="2026-02-27 01:07:47.395739584 +0000 UTC m=+3736.653279138" lastFinishedPulling="2026-02-27 01:08:31.74171557 +0000 UTC m=+3780.999255124" observedRunningTime="2026-02-27 01:08:32.839206277 +0000 UTC m=+3782.096745841" watchObservedRunningTime="2026-02-27 01:08:32.841101987 +0000 UTC m=+3782.098641561" Feb 27 01:09:42 crc 
kubenswrapper[4781]: I0227 01:09:42.895055 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:09:42 crc kubenswrapper[4781]: I0227 01:09:42.895799 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:10:00 crc kubenswrapper[4781]: I0227 01:10:00.161968 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535910-zxgrs"] Feb 27 01:10:00 crc kubenswrapper[4781]: E0227 01:10:00.163133 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6cd4500-04f9-471d-8c27-2ce1b03fa4f0" containerName="oc" Feb 27 01:10:00 crc kubenswrapper[4781]: I0227 01:10:00.163149 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6cd4500-04f9-471d-8c27-2ce1b03fa4f0" containerName="oc" Feb 27 01:10:00 crc kubenswrapper[4781]: I0227 01:10:00.163415 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6cd4500-04f9-471d-8c27-2ce1b03fa4f0" containerName="oc" Feb 27 01:10:00 crc kubenswrapper[4781]: I0227 01:10:00.164516 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535910-zxgrs" Feb 27 01:10:00 crc kubenswrapper[4781]: I0227 01:10:00.167387 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:10:00 crc kubenswrapper[4781]: I0227 01:10:00.167724 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 01:10:00 crc kubenswrapper[4781]: I0227 01:10:00.168144 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:10:00 crc kubenswrapper[4781]: I0227 01:10:00.175572 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535910-zxgrs"] Feb 27 01:10:00 crc kubenswrapper[4781]: I0227 01:10:00.265885 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsqk8\" (UniqueName: \"kubernetes.io/projected/b5688def-e560-413e-8be5-1d2cfd7e7b4b-kube-api-access-rsqk8\") pod \"auto-csr-approver-29535910-zxgrs\" (UID: \"b5688def-e560-413e-8be5-1d2cfd7e7b4b\") " pod="openshift-infra/auto-csr-approver-29535910-zxgrs" Feb 27 01:10:00 crc kubenswrapper[4781]: I0227 01:10:00.368687 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsqk8\" (UniqueName: \"kubernetes.io/projected/b5688def-e560-413e-8be5-1d2cfd7e7b4b-kube-api-access-rsqk8\") pod \"auto-csr-approver-29535910-zxgrs\" (UID: \"b5688def-e560-413e-8be5-1d2cfd7e7b4b\") " pod="openshift-infra/auto-csr-approver-29535910-zxgrs" Feb 27 01:10:00 crc kubenswrapper[4781]: I0227 01:10:00.398025 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsqk8\" (UniqueName: \"kubernetes.io/projected/b5688def-e560-413e-8be5-1d2cfd7e7b4b-kube-api-access-rsqk8\") pod \"auto-csr-approver-29535910-zxgrs\" (UID: \"b5688def-e560-413e-8be5-1d2cfd7e7b4b\") " 
pod="openshift-infra/auto-csr-approver-29535910-zxgrs" Feb 27 01:10:00 crc kubenswrapper[4781]: I0227 01:10:00.489032 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535910-zxgrs" Feb 27 01:10:00 crc kubenswrapper[4781]: I0227 01:10:00.992755 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535910-zxgrs"] Feb 27 01:10:01 crc kubenswrapper[4781]: I0227 01:10:01.646773 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535910-zxgrs" event={"ID":"b5688def-e560-413e-8be5-1d2cfd7e7b4b","Type":"ContainerStarted","Data":"6b0241584b0c5ee639c2e96362f32b966983ca1ca8d046d5c36ce5fbdc167f06"} Feb 27 01:10:03 crc kubenswrapper[4781]: I0227 01:10:03.670578 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535910-zxgrs" event={"ID":"b5688def-e560-413e-8be5-1d2cfd7e7b4b","Type":"ContainerStarted","Data":"2f2421387f96858e89c61569a502259afb51c7ee81cb327e3f4310b20461360e"} Feb 27 01:10:03 crc kubenswrapper[4781]: I0227 01:10:03.697442 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535910-zxgrs" podStartSLOduration=2.101965146 podStartE2EDuration="3.697423411s" podCreationTimestamp="2026-02-27 01:10:00 +0000 UTC" firstStartedPulling="2026-02-27 01:10:00.994507304 +0000 UTC m=+3870.252046858" lastFinishedPulling="2026-02-27 01:10:02.589965569 +0000 UTC m=+3871.847505123" observedRunningTime="2026-02-27 01:10:03.686985704 +0000 UTC m=+3872.944525248" watchObservedRunningTime="2026-02-27 01:10:03.697423411 +0000 UTC m=+3872.954962965" Feb 27 01:10:04 crc kubenswrapper[4781]: I0227 01:10:04.682659 4781 generic.go:334] "Generic (PLEG): container finished" podID="b5688def-e560-413e-8be5-1d2cfd7e7b4b" containerID="2f2421387f96858e89c61569a502259afb51c7ee81cb327e3f4310b20461360e" exitCode=0 Feb 27 01:10:04 crc 
kubenswrapper[4781]: I0227 01:10:04.682858 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535910-zxgrs" event={"ID":"b5688def-e560-413e-8be5-1d2cfd7e7b4b","Type":"ContainerDied","Data":"2f2421387f96858e89c61569a502259afb51c7ee81cb327e3f4310b20461360e"} Feb 27 01:10:06 crc kubenswrapper[4781]: I0227 01:10:06.352263 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535910-zxgrs" Feb 27 01:10:06 crc kubenswrapper[4781]: I0227 01:10:06.502363 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsqk8\" (UniqueName: \"kubernetes.io/projected/b5688def-e560-413e-8be5-1d2cfd7e7b4b-kube-api-access-rsqk8\") pod \"b5688def-e560-413e-8be5-1d2cfd7e7b4b\" (UID: \"b5688def-e560-413e-8be5-1d2cfd7e7b4b\") " Feb 27 01:10:06 crc kubenswrapper[4781]: I0227 01:10:06.508725 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5688def-e560-413e-8be5-1d2cfd7e7b4b-kube-api-access-rsqk8" (OuterVolumeSpecName: "kube-api-access-rsqk8") pod "b5688def-e560-413e-8be5-1d2cfd7e7b4b" (UID: "b5688def-e560-413e-8be5-1d2cfd7e7b4b"). InnerVolumeSpecName "kube-api-access-rsqk8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:10:06 crc kubenswrapper[4781]: I0227 01:10:06.604840 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsqk8\" (UniqueName: \"kubernetes.io/projected/b5688def-e560-413e-8be5-1d2cfd7e7b4b-kube-api-access-rsqk8\") on node \"crc\" DevicePath \"\"" Feb 27 01:10:06 crc kubenswrapper[4781]: I0227 01:10:06.705307 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535910-zxgrs" event={"ID":"b5688def-e560-413e-8be5-1d2cfd7e7b4b","Type":"ContainerDied","Data":"6b0241584b0c5ee639c2e96362f32b966983ca1ca8d046d5c36ce5fbdc167f06"} Feb 27 01:10:06 crc kubenswrapper[4781]: I0227 01:10:06.705546 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b0241584b0c5ee639c2e96362f32b966983ca1ca8d046d5c36ce5fbdc167f06" Feb 27 01:10:06 crc kubenswrapper[4781]: I0227 01:10:06.705543 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535910-zxgrs" Feb 27 01:10:06 crc kubenswrapper[4781]: I0227 01:10:06.775192 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535904-lpvnq"] Feb 27 01:10:06 crc kubenswrapper[4781]: I0227 01:10:06.784348 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535904-lpvnq"] Feb 27 01:10:07 crc kubenswrapper[4781]: I0227 01:10:07.325529 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8015524-a32f-427b-a5a9-08f1d2257259" path="/var/lib/kubelet/pods/f8015524-a32f-427b-a5a9-08f1d2257259/volumes" Feb 27 01:10:12 crc kubenswrapper[4781]: I0227 01:10:12.895902 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 27 01:10:12 crc kubenswrapper[4781]: I0227 01:10:12.896404 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:10:32 crc kubenswrapper[4781]: I0227 01:10:32.897119 4781 scope.go:117] "RemoveContainer" containerID="7727ecd4b6ab2c57f71f74adfa530ee79124f2b2f80dab2ef9d287684b1949a8" Feb 27 01:10:42 crc kubenswrapper[4781]: I0227 01:10:42.896237 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:10:42 crc kubenswrapper[4781]: I0227 01:10:42.896898 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:10:42 crc kubenswrapper[4781]: I0227 01:10:42.896952 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 01:10:42 crc kubenswrapper[4781]: I0227 01:10:42.897908 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5"} pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 
27 01:10:42 crc kubenswrapper[4781]: I0227 01:10:42.897971 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" containerID="cri-o://75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" gracePeriod=600 Feb 27 01:10:43 crc kubenswrapper[4781]: E0227 01:10:43.022675 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:10:43 crc kubenswrapper[4781]: I0227 01:10:43.054293 4781 generic.go:334] "Generic (PLEG): container finished" podID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" exitCode=0 Feb 27 01:10:43 crc kubenswrapper[4781]: I0227 01:10:43.054376 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerDied","Data":"75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5"} Feb 27 01:10:43 crc kubenswrapper[4781]: I0227 01:10:43.054445 4781 scope.go:117] "RemoveContainer" containerID="19e186a1d63090ffda2bb27999feb897d50891041c0f8dac4c6ddf6ef96ddf91" Feb 27 01:10:43 crc kubenswrapper[4781]: I0227 01:10:43.055238 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:10:43 crc kubenswrapper[4781]: E0227 01:10:43.055542 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:10:55 crc kubenswrapper[4781]: I0227 01:10:55.315852 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:10:55 crc kubenswrapper[4781]: E0227 01:10:55.316613 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:11:07 crc kubenswrapper[4781]: I0227 01:11:07.310142 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:11:07 crc kubenswrapper[4781]: E0227 01:11:07.311077 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:11:18 crc kubenswrapper[4781]: I0227 01:11:18.195656 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ggngc"] Feb 27 01:11:18 crc kubenswrapper[4781]: E0227 01:11:18.196937 4781 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b5688def-e560-413e-8be5-1d2cfd7e7b4b" containerName="oc" Feb 27 01:11:18 crc kubenswrapper[4781]: I0227 01:11:18.196954 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5688def-e560-413e-8be5-1d2cfd7e7b4b" containerName="oc" Feb 27 01:11:18 crc kubenswrapper[4781]: I0227 01:11:18.197197 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5688def-e560-413e-8be5-1d2cfd7e7b4b" containerName="oc" Feb 27 01:11:18 crc kubenswrapper[4781]: I0227 01:11:18.199030 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ggngc" Feb 27 01:11:18 crc kubenswrapper[4781]: I0227 01:11:18.206453 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ggngc"] Feb 27 01:11:18 crc kubenswrapper[4781]: I0227 01:11:18.254049 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9823bd09-bd8f-4565-8437-90af124c41f3-catalog-content\") pod \"redhat-operators-ggngc\" (UID: \"9823bd09-bd8f-4565-8437-90af124c41f3\") " pod="openshift-marketplace/redhat-operators-ggngc" Feb 27 01:11:18 crc kubenswrapper[4781]: I0227 01:11:18.254509 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9823bd09-bd8f-4565-8437-90af124c41f3-utilities\") pod \"redhat-operators-ggngc\" (UID: \"9823bd09-bd8f-4565-8437-90af124c41f3\") " pod="openshift-marketplace/redhat-operators-ggngc" Feb 27 01:11:18 crc kubenswrapper[4781]: I0227 01:11:18.254616 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmjx4\" (UniqueName: \"kubernetes.io/projected/9823bd09-bd8f-4565-8437-90af124c41f3-kube-api-access-zmjx4\") pod \"redhat-operators-ggngc\" (UID: \"9823bd09-bd8f-4565-8437-90af124c41f3\") " 
pod="openshift-marketplace/redhat-operators-ggngc" Feb 27 01:11:18 crc kubenswrapper[4781]: I0227 01:11:18.310950 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:11:18 crc kubenswrapper[4781]: E0227 01:11:18.311510 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:11:18 crc kubenswrapper[4781]: I0227 01:11:18.356258 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9823bd09-bd8f-4565-8437-90af124c41f3-utilities\") pod \"redhat-operators-ggngc\" (UID: \"9823bd09-bd8f-4565-8437-90af124c41f3\") " pod="openshift-marketplace/redhat-operators-ggngc" Feb 27 01:11:18 crc kubenswrapper[4781]: I0227 01:11:18.356576 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmjx4\" (UniqueName: \"kubernetes.io/projected/9823bd09-bd8f-4565-8437-90af124c41f3-kube-api-access-zmjx4\") pod \"redhat-operators-ggngc\" (UID: \"9823bd09-bd8f-4565-8437-90af124c41f3\") " pod="openshift-marketplace/redhat-operators-ggngc" Feb 27 01:11:18 crc kubenswrapper[4781]: I0227 01:11:18.356759 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9823bd09-bd8f-4565-8437-90af124c41f3-catalog-content\") pod \"redhat-operators-ggngc\" (UID: \"9823bd09-bd8f-4565-8437-90af124c41f3\") " pod="openshift-marketplace/redhat-operators-ggngc" Feb 27 01:11:18 crc kubenswrapper[4781]: I0227 01:11:18.356856 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9823bd09-bd8f-4565-8437-90af124c41f3-utilities\") pod \"redhat-operators-ggngc\" (UID: \"9823bd09-bd8f-4565-8437-90af124c41f3\") " pod="openshift-marketplace/redhat-operators-ggngc" Feb 27 01:11:18 crc kubenswrapper[4781]: I0227 01:11:18.357140 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9823bd09-bd8f-4565-8437-90af124c41f3-catalog-content\") pod \"redhat-operators-ggngc\" (UID: \"9823bd09-bd8f-4565-8437-90af124c41f3\") " pod="openshift-marketplace/redhat-operators-ggngc" Feb 27 01:11:18 crc kubenswrapper[4781]: I0227 01:11:18.376689 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmjx4\" (UniqueName: \"kubernetes.io/projected/9823bd09-bd8f-4565-8437-90af124c41f3-kube-api-access-zmjx4\") pod \"redhat-operators-ggngc\" (UID: \"9823bd09-bd8f-4565-8437-90af124c41f3\") " pod="openshift-marketplace/redhat-operators-ggngc" Feb 27 01:11:18 crc kubenswrapper[4781]: I0227 01:11:18.534266 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ggngc" Feb 27 01:11:19 crc kubenswrapper[4781]: I0227 01:11:19.044303 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ggngc"] Feb 27 01:11:19 crc kubenswrapper[4781]: I0227 01:11:19.420131 4781 generic.go:334] "Generic (PLEG): container finished" podID="9823bd09-bd8f-4565-8437-90af124c41f3" containerID="cd87c8220ae390b7d57fc9d6d38a9e53b68e245d14b3b9b15d787a819cfb9cd2" exitCode=0 Feb 27 01:11:19 crc kubenswrapper[4781]: I0227 01:11:19.420229 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ggngc" event={"ID":"9823bd09-bd8f-4565-8437-90af124c41f3","Type":"ContainerDied","Data":"cd87c8220ae390b7d57fc9d6d38a9e53b68e245d14b3b9b15d787a819cfb9cd2"} Feb 27 01:11:19 crc kubenswrapper[4781]: I0227 01:11:19.420463 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ggngc" event={"ID":"9823bd09-bd8f-4565-8437-90af124c41f3","Type":"ContainerStarted","Data":"5d9f1a5f9bea6c2522bc7d5eae4266a7c1dd56573912e3c0e29be855c9bd30fa"} Feb 27 01:11:19 crc kubenswrapper[4781]: I0227 01:11:19.421881 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 01:11:20 crc kubenswrapper[4781]: I0227 01:11:20.432103 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ggngc" event={"ID":"9823bd09-bd8f-4565-8437-90af124c41f3","Type":"ContainerStarted","Data":"5953fda994cd7ac67e94d751a7a40e33b594ebe74d39bf82a391b26ac6914867"} Feb 27 01:11:26 crc kubenswrapper[4781]: I0227 01:11:26.621354 4781 generic.go:334] "Generic (PLEG): container finished" podID="9823bd09-bd8f-4565-8437-90af124c41f3" containerID="5953fda994cd7ac67e94d751a7a40e33b594ebe74d39bf82a391b26ac6914867" exitCode=0 Feb 27 01:11:26 crc kubenswrapper[4781]: I0227 01:11:26.621463 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-ggngc" event={"ID":"9823bd09-bd8f-4565-8437-90af124c41f3","Type":"ContainerDied","Data":"5953fda994cd7ac67e94d751a7a40e33b594ebe74d39bf82a391b26ac6914867"} Feb 27 01:11:27 crc kubenswrapper[4781]: I0227 01:11:27.170818 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ncmpc"] Feb 27 01:11:27 crc kubenswrapper[4781]: I0227 01:11:27.173602 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ncmpc" Feb 27 01:11:27 crc kubenswrapper[4781]: I0227 01:11:27.202678 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ncmpc"] Feb 27 01:11:27 crc kubenswrapper[4781]: I0227 01:11:27.234781 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/858cd97e-43e3-45ce-be89-d6da5a51aac7-utilities\") pod \"redhat-marketplace-ncmpc\" (UID: \"858cd97e-43e3-45ce-be89-d6da5a51aac7\") " pod="openshift-marketplace/redhat-marketplace-ncmpc" Feb 27 01:11:27 crc kubenswrapper[4781]: I0227 01:11:27.234859 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/858cd97e-43e3-45ce-be89-d6da5a51aac7-catalog-content\") pod \"redhat-marketplace-ncmpc\" (UID: \"858cd97e-43e3-45ce-be89-d6da5a51aac7\") " pod="openshift-marketplace/redhat-marketplace-ncmpc" Feb 27 01:11:27 crc kubenswrapper[4781]: I0227 01:11:27.234894 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgxc7\" (UniqueName: \"kubernetes.io/projected/858cd97e-43e3-45ce-be89-d6da5a51aac7-kube-api-access-vgxc7\") pod \"redhat-marketplace-ncmpc\" (UID: \"858cd97e-43e3-45ce-be89-d6da5a51aac7\") " pod="openshift-marketplace/redhat-marketplace-ncmpc" Feb 27 01:11:27 crc 
kubenswrapper[4781]: I0227 01:11:27.337244 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/858cd97e-43e3-45ce-be89-d6da5a51aac7-utilities\") pod \"redhat-marketplace-ncmpc\" (UID: \"858cd97e-43e3-45ce-be89-d6da5a51aac7\") " pod="openshift-marketplace/redhat-marketplace-ncmpc" Feb 27 01:11:27 crc kubenswrapper[4781]: I0227 01:11:27.337337 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/858cd97e-43e3-45ce-be89-d6da5a51aac7-catalog-content\") pod \"redhat-marketplace-ncmpc\" (UID: \"858cd97e-43e3-45ce-be89-d6da5a51aac7\") " pod="openshift-marketplace/redhat-marketplace-ncmpc" Feb 27 01:11:27 crc kubenswrapper[4781]: I0227 01:11:27.337380 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgxc7\" (UniqueName: \"kubernetes.io/projected/858cd97e-43e3-45ce-be89-d6da5a51aac7-kube-api-access-vgxc7\") pod \"redhat-marketplace-ncmpc\" (UID: \"858cd97e-43e3-45ce-be89-d6da5a51aac7\") " pod="openshift-marketplace/redhat-marketplace-ncmpc" Feb 27 01:11:27 crc kubenswrapper[4781]: I0227 01:11:27.339381 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/858cd97e-43e3-45ce-be89-d6da5a51aac7-catalog-content\") pod \"redhat-marketplace-ncmpc\" (UID: \"858cd97e-43e3-45ce-be89-d6da5a51aac7\") " pod="openshift-marketplace/redhat-marketplace-ncmpc" Feb 27 01:11:27 crc kubenswrapper[4781]: I0227 01:11:27.339471 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/858cd97e-43e3-45ce-be89-d6da5a51aac7-utilities\") pod \"redhat-marketplace-ncmpc\" (UID: \"858cd97e-43e3-45ce-be89-d6da5a51aac7\") " pod="openshift-marketplace/redhat-marketplace-ncmpc" Feb 27 01:11:27 crc kubenswrapper[4781]: I0227 01:11:27.379312 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgxc7\" (UniqueName: \"kubernetes.io/projected/858cd97e-43e3-45ce-be89-d6da5a51aac7-kube-api-access-vgxc7\") pod \"redhat-marketplace-ncmpc\" (UID: \"858cd97e-43e3-45ce-be89-d6da5a51aac7\") " pod="openshift-marketplace/redhat-marketplace-ncmpc" Feb 27 01:11:27 crc kubenswrapper[4781]: I0227 01:11:27.498887 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ncmpc" Feb 27 01:11:27 crc kubenswrapper[4781]: I0227 01:11:27.642989 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ggngc" event={"ID":"9823bd09-bd8f-4565-8437-90af124c41f3","Type":"ContainerStarted","Data":"5401633aabeaa5cbe1470b20fd3bcb61c3185700eaa796901f5a2bf04d10b7fd"} Feb 27 01:11:27 crc kubenswrapper[4781]: I0227 01:11:27.672053 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ggngc" podStartSLOduration=2.095321715 podStartE2EDuration="9.671617633s" podCreationTimestamp="2026-02-27 01:11:18 +0000 UTC" firstStartedPulling="2026-02-27 01:11:19.421687611 +0000 UTC m=+3948.679227165" lastFinishedPulling="2026-02-27 01:11:26.997983529 +0000 UTC m=+3956.255523083" observedRunningTime="2026-02-27 01:11:27.663178978 +0000 UTC m=+3956.920718532" watchObservedRunningTime="2026-02-27 01:11:27.671617633 +0000 UTC m=+3956.929157197" Feb 27 01:11:28 crc kubenswrapper[4781]: I0227 01:11:28.006269 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ncmpc"] Feb 27 01:11:28 crc kubenswrapper[4781]: W0227 01:11:28.008951 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod858cd97e_43e3_45ce_be89_d6da5a51aac7.slice/crio-40372cb860e2866b26af4ce398fbba0bee8e1e438dae95e85433fd83b1b549fe WatchSource:0}: Error finding container 
40372cb860e2866b26af4ce398fbba0bee8e1e438dae95e85433fd83b1b549fe: Status 404 returned error can't find the container with id 40372cb860e2866b26af4ce398fbba0bee8e1e438dae95e85433fd83b1b549fe Feb 27 01:11:28 crc kubenswrapper[4781]: I0227 01:11:28.535107 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ggngc" Feb 27 01:11:28 crc kubenswrapper[4781]: I0227 01:11:28.535472 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ggngc" Feb 27 01:11:28 crc kubenswrapper[4781]: I0227 01:11:28.653247 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncmpc" event={"ID":"858cd97e-43e3-45ce-be89-d6da5a51aac7","Type":"ContainerStarted","Data":"2ba2a2165e9383ddd5ff3c78c97979c8874aecde604bc1e9b8d769c6e1571dcb"} Feb 27 01:11:28 crc kubenswrapper[4781]: I0227 01:11:28.653304 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncmpc" event={"ID":"858cd97e-43e3-45ce-be89-d6da5a51aac7","Type":"ContainerStarted","Data":"40372cb860e2866b26af4ce398fbba0bee8e1e438dae95e85433fd83b1b549fe"} Feb 27 01:11:29 crc kubenswrapper[4781]: I0227 01:11:29.309663 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:11:29 crc kubenswrapper[4781]: E0227 01:11:29.310231 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:11:29 crc kubenswrapper[4781]: I0227 01:11:29.582874 4781 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-ggngc" podUID="9823bd09-bd8f-4565-8437-90af124c41f3" containerName="registry-server" probeResult="failure" output=< Feb 27 01:11:29 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s Feb 27 01:11:29 crc kubenswrapper[4781]: > Feb 27 01:11:29 crc kubenswrapper[4781]: I0227 01:11:29.669400 4781 generic.go:334] "Generic (PLEG): container finished" podID="858cd97e-43e3-45ce-be89-d6da5a51aac7" containerID="2ba2a2165e9383ddd5ff3c78c97979c8874aecde604bc1e9b8d769c6e1571dcb" exitCode=0 Feb 27 01:11:29 crc kubenswrapper[4781]: I0227 01:11:29.669445 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncmpc" event={"ID":"858cd97e-43e3-45ce-be89-d6da5a51aac7","Type":"ContainerDied","Data":"2ba2a2165e9383ddd5ff3c78c97979c8874aecde604bc1e9b8d769c6e1571dcb"} Feb 27 01:11:30 crc kubenswrapper[4781]: I0227 01:11:30.680774 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncmpc" event={"ID":"858cd97e-43e3-45ce-be89-d6da5a51aac7","Type":"ContainerStarted","Data":"f61fceda8ecdf53a108d46e113dba2a351a046647210d151538f477d3ac77d69"} Feb 27 01:11:32 crc kubenswrapper[4781]: I0227 01:11:32.704798 4781 generic.go:334] "Generic (PLEG): container finished" podID="858cd97e-43e3-45ce-be89-d6da5a51aac7" containerID="f61fceda8ecdf53a108d46e113dba2a351a046647210d151538f477d3ac77d69" exitCode=0 Feb 27 01:11:32 crc kubenswrapper[4781]: I0227 01:11:32.704903 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncmpc" event={"ID":"858cd97e-43e3-45ce-be89-d6da5a51aac7","Type":"ContainerDied","Data":"f61fceda8ecdf53a108d46e113dba2a351a046647210d151538f477d3ac77d69"} Feb 27 01:11:33 crc kubenswrapper[4781]: I0227 01:11:33.726775 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncmpc" 
event={"ID":"858cd97e-43e3-45ce-be89-d6da5a51aac7","Type":"ContainerStarted","Data":"b48b458f78f610e3316f5abe5febe0be5e6bf15da161b03b3fab76670e747aa8"} Feb 27 01:11:33 crc kubenswrapper[4781]: I0227 01:11:33.786723 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ncmpc" podStartSLOduration=3.347529622 podStartE2EDuration="6.786702413s" podCreationTimestamp="2026-02-27 01:11:27 +0000 UTC" firstStartedPulling="2026-02-27 01:11:29.672002658 +0000 UTC m=+3958.929542212" lastFinishedPulling="2026-02-27 01:11:33.111175449 +0000 UTC m=+3962.368715003" observedRunningTime="2026-02-27 01:11:33.779385428 +0000 UTC m=+3963.036924972" watchObservedRunningTime="2026-02-27 01:11:33.786702413 +0000 UTC m=+3963.044241967" Feb 27 01:11:37 crc kubenswrapper[4781]: I0227 01:11:37.499219 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ncmpc" Feb 27 01:11:37 crc kubenswrapper[4781]: I0227 01:11:37.499868 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ncmpc" Feb 27 01:11:38 crc kubenswrapper[4781]: I0227 01:11:38.562862 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-ncmpc" podUID="858cd97e-43e3-45ce-be89-d6da5a51aac7" containerName="registry-server" probeResult="failure" output=< Feb 27 01:11:38 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s Feb 27 01:11:38 crc kubenswrapper[4781]: > Feb 27 01:11:39 crc kubenswrapper[4781]: I0227 01:11:39.588921 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ggngc" podUID="9823bd09-bd8f-4565-8437-90af124c41f3" containerName="registry-server" probeResult="failure" output=< Feb 27 01:11:39 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s Feb 27 01:11:39 crc kubenswrapper[4781]: > 
Feb 27 01:11:43 crc kubenswrapper[4781]: I0227 01:11:43.312456 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:11:43 crc kubenswrapper[4781]: E0227 01:11:43.314297 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:11:47 crc kubenswrapper[4781]: I0227 01:11:47.549554 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ncmpc" Feb 27 01:11:47 crc kubenswrapper[4781]: I0227 01:11:47.604342 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ncmpc" Feb 27 01:11:47 crc kubenswrapper[4781]: I0227 01:11:47.812018 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ncmpc"] Feb 27 01:11:48 crc kubenswrapper[4781]: I0227 01:11:48.588724 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ggngc" Feb 27 01:11:48 crc kubenswrapper[4781]: I0227 01:11:48.640391 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ggngc" Feb 27 01:11:48 crc kubenswrapper[4781]: I0227 01:11:48.929325 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ncmpc" podUID="858cd97e-43e3-45ce-be89-d6da5a51aac7" containerName="registry-server" containerID="cri-o://b48b458f78f610e3316f5abe5febe0be5e6bf15da161b03b3fab76670e747aa8" gracePeriod=2 Feb 27 01:11:49 crc 
kubenswrapper[4781]: I0227 01:11:49.725121 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ncmpc" Feb 27 01:11:49 crc kubenswrapper[4781]: I0227 01:11:49.885956 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/858cd97e-43e3-45ce-be89-d6da5a51aac7-utilities\") pod \"858cd97e-43e3-45ce-be89-d6da5a51aac7\" (UID: \"858cd97e-43e3-45ce-be89-d6da5a51aac7\") " Feb 27 01:11:49 crc kubenswrapper[4781]: I0227 01:11:49.886471 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgxc7\" (UniqueName: \"kubernetes.io/projected/858cd97e-43e3-45ce-be89-d6da5a51aac7-kube-api-access-vgxc7\") pod \"858cd97e-43e3-45ce-be89-d6da5a51aac7\" (UID: \"858cd97e-43e3-45ce-be89-d6da5a51aac7\") " Feb 27 01:11:49 crc kubenswrapper[4781]: I0227 01:11:49.886508 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/858cd97e-43e3-45ce-be89-d6da5a51aac7-catalog-content\") pod \"858cd97e-43e3-45ce-be89-d6da5a51aac7\" (UID: \"858cd97e-43e3-45ce-be89-d6da5a51aac7\") " Feb 27 01:11:49 crc kubenswrapper[4781]: I0227 01:11:49.886554 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/858cd97e-43e3-45ce-be89-d6da5a51aac7-utilities" (OuterVolumeSpecName: "utilities") pod "858cd97e-43e3-45ce-be89-d6da5a51aac7" (UID: "858cd97e-43e3-45ce-be89-d6da5a51aac7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:11:49 crc kubenswrapper[4781]: I0227 01:11:49.887216 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/858cd97e-43e3-45ce-be89-d6da5a51aac7-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:11:49 crc kubenswrapper[4781]: I0227 01:11:49.892905 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/858cd97e-43e3-45ce-be89-d6da5a51aac7-kube-api-access-vgxc7" (OuterVolumeSpecName: "kube-api-access-vgxc7") pod "858cd97e-43e3-45ce-be89-d6da5a51aac7" (UID: "858cd97e-43e3-45ce-be89-d6da5a51aac7"). InnerVolumeSpecName "kube-api-access-vgxc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:11:49 crc kubenswrapper[4781]: I0227 01:11:49.912215 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/858cd97e-43e3-45ce-be89-d6da5a51aac7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "858cd97e-43e3-45ce-be89-d6da5a51aac7" (UID: "858cd97e-43e3-45ce-be89-d6da5a51aac7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:11:49 crc kubenswrapper[4781]: I0227 01:11:49.944214 4781 generic.go:334] "Generic (PLEG): container finished" podID="858cd97e-43e3-45ce-be89-d6da5a51aac7" containerID="b48b458f78f610e3316f5abe5febe0be5e6bf15da161b03b3fab76670e747aa8" exitCode=0 Feb 27 01:11:49 crc kubenswrapper[4781]: I0227 01:11:49.944291 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncmpc" event={"ID":"858cd97e-43e3-45ce-be89-d6da5a51aac7","Type":"ContainerDied","Data":"b48b458f78f610e3316f5abe5febe0be5e6bf15da161b03b3fab76670e747aa8"} Feb 27 01:11:49 crc kubenswrapper[4781]: I0227 01:11:49.944333 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncmpc" event={"ID":"858cd97e-43e3-45ce-be89-d6da5a51aac7","Type":"ContainerDied","Data":"40372cb860e2866b26af4ce398fbba0bee8e1e438dae95e85433fd83b1b549fe"} Feb 27 01:11:49 crc kubenswrapper[4781]: I0227 01:11:49.944356 4781 scope.go:117] "RemoveContainer" containerID="b48b458f78f610e3316f5abe5febe0be5e6bf15da161b03b3fab76670e747aa8" Feb 27 01:11:49 crc kubenswrapper[4781]: I0227 01:11:49.944597 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ncmpc" Feb 27 01:11:49 crc kubenswrapper[4781]: I0227 01:11:49.980044 4781 scope.go:117] "RemoveContainer" containerID="f61fceda8ecdf53a108d46e113dba2a351a046647210d151538f477d3ac77d69" Feb 27 01:11:49 crc kubenswrapper[4781]: I0227 01:11:49.989894 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgxc7\" (UniqueName: \"kubernetes.io/projected/858cd97e-43e3-45ce-be89-d6da5a51aac7-kube-api-access-vgxc7\") on node \"crc\" DevicePath \"\"" Feb 27 01:11:49 crc kubenswrapper[4781]: I0227 01:11:49.989923 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/858cd97e-43e3-45ce-be89-d6da5a51aac7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:11:49 crc kubenswrapper[4781]: I0227 01:11:49.991988 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ncmpc"] Feb 27 01:11:50 crc kubenswrapper[4781]: I0227 01:11:50.005542 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ncmpc"] Feb 27 01:11:50 crc kubenswrapper[4781]: I0227 01:11:50.021580 4781 scope.go:117] "RemoveContainer" containerID="2ba2a2165e9383ddd5ff3c78c97979c8874aecde604bc1e9b8d769c6e1571dcb" Feb 27 01:11:50 crc kubenswrapper[4781]: I0227 01:11:50.057967 4781 scope.go:117] "RemoveContainer" containerID="b48b458f78f610e3316f5abe5febe0be5e6bf15da161b03b3fab76670e747aa8" Feb 27 01:11:50 crc kubenswrapper[4781]: E0227 01:11:50.058521 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b48b458f78f610e3316f5abe5febe0be5e6bf15da161b03b3fab76670e747aa8\": container with ID starting with b48b458f78f610e3316f5abe5febe0be5e6bf15da161b03b3fab76670e747aa8 not found: ID does not exist" containerID="b48b458f78f610e3316f5abe5febe0be5e6bf15da161b03b3fab76670e747aa8" Feb 27 01:11:50 crc 
kubenswrapper[4781]: I0227 01:11:50.058552 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b48b458f78f610e3316f5abe5febe0be5e6bf15da161b03b3fab76670e747aa8"} err="failed to get container status \"b48b458f78f610e3316f5abe5febe0be5e6bf15da161b03b3fab76670e747aa8\": rpc error: code = NotFound desc = could not find container \"b48b458f78f610e3316f5abe5febe0be5e6bf15da161b03b3fab76670e747aa8\": container with ID starting with b48b458f78f610e3316f5abe5febe0be5e6bf15da161b03b3fab76670e747aa8 not found: ID does not exist" Feb 27 01:11:50 crc kubenswrapper[4781]: I0227 01:11:50.058591 4781 scope.go:117] "RemoveContainer" containerID="f61fceda8ecdf53a108d46e113dba2a351a046647210d151538f477d3ac77d69" Feb 27 01:11:50 crc kubenswrapper[4781]: E0227 01:11:50.059020 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f61fceda8ecdf53a108d46e113dba2a351a046647210d151538f477d3ac77d69\": container with ID starting with f61fceda8ecdf53a108d46e113dba2a351a046647210d151538f477d3ac77d69 not found: ID does not exist" containerID="f61fceda8ecdf53a108d46e113dba2a351a046647210d151538f477d3ac77d69" Feb 27 01:11:50 crc kubenswrapper[4781]: I0227 01:11:50.059047 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f61fceda8ecdf53a108d46e113dba2a351a046647210d151538f477d3ac77d69"} err="failed to get container status \"f61fceda8ecdf53a108d46e113dba2a351a046647210d151538f477d3ac77d69\": rpc error: code = NotFound desc = could not find container \"f61fceda8ecdf53a108d46e113dba2a351a046647210d151538f477d3ac77d69\": container with ID starting with f61fceda8ecdf53a108d46e113dba2a351a046647210d151538f477d3ac77d69 not found: ID does not exist" Feb 27 01:11:50 crc kubenswrapper[4781]: I0227 01:11:50.059060 4781 scope.go:117] "RemoveContainer" containerID="2ba2a2165e9383ddd5ff3c78c97979c8874aecde604bc1e9b8d769c6e1571dcb" Feb 27 
01:11:50 crc kubenswrapper[4781]: E0227 01:11:50.059385 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ba2a2165e9383ddd5ff3c78c97979c8874aecde604bc1e9b8d769c6e1571dcb\": container with ID starting with 2ba2a2165e9383ddd5ff3c78c97979c8874aecde604bc1e9b8d769c6e1571dcb not found: ID does not exist" containerID="2ba2a2165e9383ddd5ff3c78c97979c8874aecde604bc1e9b8d769c6e1571dcb" Feb 27 01:11:50 crc kubenswrapper[4781]: I0227 01:11:50.059425 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ba2a2165e9383ddd5ff3c78c97979c8874aecde604bc1e9b8d769c6e1571dcb"} err="failed to get container status \"2ba2a2165e9383ddd5ff3c78c97979c8874aecde604bc1e9b8d769c6e1571dcb\": rpc error: code = NotFound desc = could not find container \"2ba2a2165e9383ddd5ff3c78c97979c8874aecde604bc1e9b8d769c6e1571dcb\": container with ID starting with 2ba2a2165e9383ddd5ff3c78c97979c8874aecde604bc1e9b8d769c6e1571dcb not found: ID does not exist" Feb 27 01:11:50 crc kubenswrapper[4781]: I0227 01:11:50.398071 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ggngc"] Feb 27 01:11:50 crc kubenswrapper[4781]: I0227 01:11:50.398710 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ggngc" podUID="9823bd09-bd8f-4565-8437-90af124c41f3" containerName="registry-server" containerID="cri-o://5401633aabeaa5cbe1470b20fd3bcb61c3185700eaa796901f5a2bf04d10b7fd" gracePeriod=2 Feb 27 01:11:50 crc kubenswrapper[4781]: I0227 01:11:50.961153 4781 generic.go:334] "Generic (PLEG): container finished" podID="9823bd09-bd8f-4565-8437-90af124c41f3" containerID="5401633aabeaa5cbe1470b20fd3bcb61c3185700eaa796901f5a2bf04d10b7fd" exitCode=0 Feb 27 01:11:50 crc kubenswrapper[4781]: I0227 01:11:50.961208 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-ggngc" event={"ID":"9823bd09-bd8f-4565-8437-90af124c41f3","Type":"ContainerDied","Data":"5401633aabeaa5cbe1470b20fd3bcb61c3185700eaa796901f5a2bf04d10b7fd"} Feb 27 01:11:51 crc kubenswrapper[4781]: I0227 01:11:51.151112 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ggngc" Feb 27 01:11:51 crc kubenswrapper[4781]: I0227 01:11:51.219006 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9823bd09-bd8f-4565-8437-90af124c41f3-catalog-content\") pod \"9823bd09-bd8f-4565-8437-90af124c41f3\" (UID: \"9823bd09-bd8f-4565-8437-90af124c41f3\") " Feb 27 01:11:51 crc kubenswrapper[4781]: I0227 01:11:51.219087 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9823bd09-bd8f-4565-8437-90af124c41f3-utilities\") pod \"9823bd09-bd8f-4565-8437-90af124c41f3\" (UID: \"9823bd09-bd8f-4565-8437-90af124c41f3\") " Feb 27 01:11:51 crc kubenswrapper[4781]: I0227 01:11:51.219328 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmjx4\" (UniqueName: \"kubernetes.io/projected/9823bd09-bd8f-4565-8437-90af124c41f3-kube-api-access-zmjx4\") pod \"9823bd09-bd8f-4565-8437-90af124c41f3\" (UID: \"9823bd09-bd8f-4565-8437-90af124c41f3\") " Feb 27 01:11:51 crc kubenswrapper[4781]: I0227 01:11:51.219805 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9823bd09-bd8f-4565-8437-90af124c41f3-utilities" (OuterVolumeSpecName: "utilities") pod "9823bd09-bd8f-4565-8437-90af124c41f3" (UID: "9823bd09-bd8f-4565-8437-90af124c41f3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:11:51 crc kubenswrapper[4781]: I0227 01:11:51.220357 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9823bd09-bd8f-4565-8437-90af124c41f3-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:11:51 crc kubenswrapper[4781]: I0227 01:11:51.224896 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9823bd09-bd8f-4565-8437-90af124c41f3-kube-api-access-zmjx4" (OuterVolumeSpecName: "kube-api-access-zmjx4") pod "9823bd09-bd8f-4565-8437-90af124c41f3" (UID: "9823bd09-bd8f-4565-8437-90af124c41f3"). InnerVolumeSpecName "kube-api-access-zmjx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:11:51 crc kubenswrapper[4781]: I0227 01:11:51.322535 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmjx4\" (UniqueName: \"kubernetes.io/projected/9823bd09-bd8f-4565-8437-90af124c41f3-kube-api-access-zmjx4\") on node \"crc\" DevicePath \"\"" Feb 27 01:11:51 crc kubenswrapper[4781]: I0227 01:11:51.325893 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="858cd97e-43e3-45ce-be89-d6da5a51aac7" path="/var/lib/kubelet/pods/858cd97e-43e3-45ce-be89-d6da5a51aac7/volumes" Feb 27 01:11:51 crc kubenswrapper[4781]: I0227 01:11:51.383126 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9823bd09-bd8f-4565-8437-90af124c41f3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9823bd09-bd8f-4565-8437-90af124c41f3" (UID: "9823bd09-bd8f-4565-8437-90af124c41f3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:11:51 crc kubenswrapper[4781]: I0227 01:11:51.424987 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9823bd09-bd8f-4565-8437-90af124c41f3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:11:51 crc kubenswrapper[4781]: I0227 01:11:51.977748 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ggngc" event={"ID":"9823bd09-bd8f-4565-8437-90af124c41f3","Type":"ContainerDied","Data":"5d9f1a5f9bea6c2522bc7d5eae4266a7c1dd56573912e3c0e29be855c9bd30fa"} Feb 27 01:11:51 crc kubenswrapper[4781]: I0227 01:11:51.977804 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ggngc" Feb 27 01:11:51 crc kubenswrapper[4781]: I0227 01:11:51.977828 4781 scope.go:117] "RemoveContainer" containerID="5401633aabeaa5cbe1470b20fd3bcb61c3185700eaa796901f5a2bf04d10b7fd" Feb 27 01:11:52 crc kubenswrapper[4781]: I0227 01:11:52.015092 4781 scope.go:117] "RemoveContainer" containerID="5953fda994cd7ac67e94d751a7a40e33b594ebe74d39bf82a391b26ac6914867" Feb 27 01:11:52 crc kubenswrapper[4781]: I0227 01:11:52.017527 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ggngc"] Feb 27 01:11:52 crc kubenswrapper[4781]: I0227 01:11:52.035431 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ggngc"] Feb 27 01:11:52 crc kubenswrapper[4781]: I0227 01:11:52.054860 4781 scope.go:117] "RemoveContainer" containerID="cd87c8220ae390b7d57fc9d6d38a9e53b68e245d14b3b9b15d787a819cfb9cd2" Feb 27 01:11:53 crc kubenswrapper[4781]: I0227 01:11:53.322668 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9823bd09-bd8f-4565-8437-90af124c41f3" path="/var/lib/kubelet/pods/9823bd09-bd8f-4565-8437-90af124c41f3/volumes" Feb 27 01:11:55 crc 
kubenswrapper[4781]: I0227 01:11:55.309612 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:11:55 crc kubenswrapper[4781]: E0227 01:11:55.310586 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:12:00 crc kubenswrapper[4781]: I0227 01:12:00.150531 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535912-gtvv9"] Feb 27 01:12:00 crc kubenswrapper[4781]: E0227 01:12:00.151513 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="858cd97e-43e3-45ce-be89-d6da5a51aac7" containerName="extract-utilities" Feb 27 01:12:00 crc kubenswrapper[4781]: I0227 01:12:00.151526 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="858cd97e-43e3-45ce-be89-d6da5a51aac7" containerName="extract-utilities" Feb 27 01:12:00 crc kubenswrapper[4781]: E0227 01:12:00.151548 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="858cd97e-43e3-45ce-be89-d6da5a51aac7" containerName="registry-server" Feb 27 01:12:00 crc kubenswrapper[4781]: I0227 01:12:00.151556 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="858cd97e-43e3-45ce-be89-d6da5a51aac7" containerName="registry-server" Feb 27 01:12:00 crc kubenswrapper[4781]: E0227 01:12:00.151569 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9823bd09-bd8f-4565-8437-90af124c41f3" containerName="extract-utilities" Feb 27 01:12:00 crc kubenswrapper[4781]: I0227 01:12:00.151575 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9823bd09-bd8f-4565-8437-90af124c41f3" 
containerName="extract-utilities" Feb 27 01:12:00 crc kubenswrapper[4781]: E0227 01:12:00.151585 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9823bd09-bd8f-4565-8437-90af124c41f3" containerName="extract-content" Feb 27 01:12:00 crc kubenswrapper[4781]: I0227 01:12:00.151591 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9823bd09-bd8f-4565-8437-90af124c41f3" containerName="extract-content" Feb 27 01:12:00 crc kubenswrapper[4781]: E0227 01:12:00.151601 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9823bd09-bd8f-4565-8437-90af124c41f3" containerName="registry-server" Feb 27 01:12:00 crc kubenswrapper[4781]: I0227 01:12:00.151607 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9823bd09-bd8f-4565-8437-90af124c41f3" containerName="registry-server" Feb 27 01:12:00 crc kubenswrapper[4781]: E0227 01:12:00.151651 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="858cd97e-43e3-45ce-be89-d6da5a51aac7" containerName="extract-content" Feb 27 01:12:00 crc kubenswrapper[4781]: I0227 01:12:00.151657 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="858cd97e-43e3-45ce-be89-d6da5a51aac7" containerName="extract-content" Feb 27 01:12:00 crc kubenswrapper[4781]: I0227 01:12:00.151858 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="858cd97e-43e3-45ce-be89-d6da5a51aac7" containerName="registry-server" Feb 27 01:12:00 crc kubenswrapper[4781]: I0227 01:12:00.151880 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="9823bd09-bd8f-4565-8437-90af124c41f3" containerName="registry-server" Feb 27 01:12:00 crc kubenswrapper[4781]: I0227 01:12:00.152884 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535912-gtvv9" Feb 27 01:12:00 crc kubenswrapper[4781]: I0227 01:12:00.156290 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:12:00 crc kubenswrapper[4781]: I0227 01:12:00.156562 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 01:12:00 crc kubenswrapper[4781]: I0227 01:12:00.156752 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:12:00 crc kubenswrapper[4781]: I0227 01:12:00.165205 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535912-gtvv9"] Feb 27 01:12:00 crc kubenswrapper[4781]: I0227 01:12:00.207802 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4dd8\" (UniqueName: \"kubernetes.io/projected/95d6c94d-6b4e-4d64-8c67-eb43c03187c2-kube-api-access-n4dd8\") pod \"auto-csr-approver-29535912-gtvv9\" (UID: \"95d6c94d-6b4e-4d64-8c67-eb43c03187c2\") " pod="openshift-infra/auto-csr-approver-29535912-gtvv9" Feb 27 01:12:00 crc kubenswrapper[4781]: I0227 01:12:00.310418 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4dd8\" (UniqueName: \"kubernetes.io/projected/95d6c94d-6b4e-4d64-8c67-eb43c03187c2-kube-api-access-n4dd8\") pod \"auto-csr-approver-29535912-gtvv9\" (UID: \"95d6c94d-6b4e-4d64-8c67-eb43c03187c2\") " pod="openshift-infra/auto-csr-approver-29535912-gtvv9" Feb 27 01:12:00 crc kubenswrapper[4781]: I0227 01:12:00.336372 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4dd8\" (UniqueName: \"kubernetes.io/projected/95d6c94d-6b4e-4d64-8c67-eb43c03187c2-kube-api-access-n4dd8\") pod \"auto-csr-approver-29535912-gtvv9\" (UID: \"95d6c94d-6b4e-4d64-8c67-eb43c03187c2\") " 
pod="openshift-infra/auto-csr-approver-29535912-gtvv9" Feb 27 01:12:00 crc kubenswrapper[4781]: I0227 01:12:00.475020 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535912-gtvv9" Feb 27 01:12:01 crc kubenswrapper[4781]: I0227 01:12:01.028781 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535912-gtvv9"] Feb 27 01:12:01 crc kubenswrapper[4781]: I0227 01:12:01.063484 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535912-gtvv9" event={"ID":"95d6c94d-6b4e-4d64-8c67-eb43c03187c2","Type":"ContainerStarted","Data":"ea75dc0bb1b7c654c05cb33d20795b28e260254c83b72d580f113ad5a3a0caaa"} Feb 27 01:12:03 crc kubenswrapper[4781]: I0227 01:12:03.083257 4781 generic.go:334] "Generic (PLEG): container finished" podID="95d6c94d-6b4e-4d64-8c67-eb43c03187c2" containerID="8f787ca4f347bb157c6f5d9ee468bbb739868634c8f4daa10b685f41a5344282" exitCode=0 Feb 27 01:12:03 crc kubenswrapper[4781]: I0227 01:12:03.083331 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535912-gtvv9" event={"ID":"95d6c94d-6b4e-4d64-8c67-eb43c03187c2","Type":"ContainerDied","Data":"8f787ca4f347bb157c6f5d9ee468bbb739868634c8f4daa10b685f41a5344282"} Feb 27 01:12:04 crc kubenswrapper[4781]: I0227 01:12:04.809880 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535912-gtvv9" Feb 27 01:12:04 crc kubenswrapper[4781]: I0227 01:12:04.883336 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4dd8\" (UniqueName: \"kubernetes.io/projected/95d6c94d-6b4e-4d64-8c67-eb43c03187c2-kube-api-access-n4dd8\") pod \"95d6c94d-6b4e-4d64-8c67-eb43c03187c2\" (UID: \"95d6c94d-6b4e-4d64-8c67-eb43c03187c2\") " Feb 27 01:12:04 crc kubenswrapper[4781]: I0227 01:12:04.893162 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95d6c94d-6b4e-4d64-8c67-eb43c03187c2-kube-api-access-n4dd8" (OuterVolumeSpecName: "kube-api-access-n4dd8") pod "95d6c94d-6b4e-4d64-8c67-eb43c03187c2" (UID: "95d6c94d-6b4e-4d64-8c67-eb43c03187c2"). InnerVolumeSpecName "kube-api-access-n4dd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:12:04 crc kubenswrapper[4781]: I0227 01:12:04.987975 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4dd8\" (UniqueName: \"kubernetes.io/projected/95d6c94d-6b4e-4d64-8c67-eb43c03187c2-kube-api-access-n4dd8\") on node \"crc\" DevicePath \"\"" Feb 27 01:12:05 crc kubenswrapper[4781]: I0227 01:12:05.133918 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535912-gtvv9" event={"ID":"95d6c94d-6b4e-4d64-8c67-eb43c03187c2","Type":"ContainerDied","Data":"ea75dc0bb1b7c654c05cb33d20795b28e260254c83b72d580f113ad5a3a0caaa"} Feb 27 01:12:05 crc kubenswrapper[4781]: I0227 01:12:05.133958 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea75dc0bb1b7c654c05cb33d20795b28e260254c83b72d580f113ad5a3a0caaa" Feb 27 01:12:05 crc kubenswrapper[4781]: I0227 01:12:05.134010 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535912-gtvv9" Feb 27 01:12:05 crc kubenswrapper[4781]: I0227 01:12:05.917994 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535906-d594j"] Feb 27 01:12:05 crc kubenswrapper[4781]: I0227 01:12:05.926828 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535906-d594j"] Feb 27 01:12:07 crc kubenswrapper[4781]: I0227 01:12:07.323271 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dab64a02-9142-4b6f-95c2-1e3805ef62fc" path="/var/lib/kubelet/pods/dab64a02-9142-4b6f-95c2-1e3805ef62fc/volumes" Feb 27 01:12:08 crc kubenswrapper[4781]: I0227 01:12:08.310165 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:12:08 crc kubenswrapper[4781]: E0227 01:12:08.310533 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:12:22 crc kubenswrapper[4781]: I0227 01:12:22.309556 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:12:22 crc kubenswrapper[4781]: E0227 01:12:22.311480 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" 
podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:12:32 crc kubenswrapper[4781]: I0227 01:12:32.998012 4781 scope.go:117] "RemoveContainer" containerID="b05e61a8466110a32ab8e96fdf9a1fec0c346bbcf2b136cb6d58c69fbbfe2a41" Feb 27 01:12:37 crc kubenswrapper[4781]: I0227 01:12:37.309978 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:12:37 crc kubenswrapper[4781]: E0227 01:12:37.311334 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:12:50 crc kubenswrapper[4781]: I0227 01:12:50.309702 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:12:50 crc kubenswrapper[4781]: E0227 01:12:50.310571 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:13:01 crc kubenswrapper[4781]: I0227 01:13:01.319433 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:13:01 crc kubenswrapper[4781]: E0227 01:13:01.320286 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:13:02 crc kubenswrapper[4781]: I0227 01:13:02.382533 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sjrw7"] Feb 27 01:13:02 crc kubenswrapper[4781]: E0227 01:13:02.383100 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95d6c94d-6b4e-4d64-8c67-eb43c03187c2" containerName="oc" Feb 27 01:13:02 crc kubenswrapper[4781]: I0227 01:13:02.383115 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="95d6c94d-6b4e-4d64-8c67-eb43c03187c2" containerName="oc" Feb 27 01:13:02 crc kubenswrapper[4781]: I0227 01:13:02.383373 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="95d6c94d-6b4e-4d64-8c67-eb43c03187c2" containerName="oc" Feb 27 01:13:02 crc kubenswrapper[4781]: I0227 01:13:02.385612 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sjrw7" Feb 27 01:13:02 crc kubenswrapper[4781]: I0227 01:13:02.395171 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sjrw7"] Feb 27 01:13:02 crc kubenswrapper[4781]: I0227 01:13:02.550439 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqv4r\" (UniqueName: \"kubernetes.io/projected/d4577ec6-c8bb-4b50-912b-59bedb35c38b-kube-api-access-lqv4r\") pod \"certified-operators-sjrw7\" (UID: \"d4577ec6-c8bb-4b50-912b-59bedb35c38b\") " pod="openshift-marketplace/certified-operators-sjrw7" Feb 27 01:13:02 crc kubenswrapper[4781]: I0227 01:13:02.551171 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4577ec6-c8bb-4b50-912b-59bedb35c38b-utilities\") pod \"certified-operators-sjrw7\" (UID: \"d4577ec6-c8bb-4b50-912b-59bedb35c38b\") " pod="openshift-marketplace/certified-operators-sjrw7" Feb 27 01:13:02 crc kubenswrapper[4781]: I0227 01:13:02.551303 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4577ec6-c8bb-4b50-912b-59bedb35c38b-catalog-content\") pod \"certified-operators-sjrw7\" (UID: \"d4577ec6-c8bb-4b50-912b-59bedb35c38b\") " pod="openshift-marketplace/certified-operators-sjrw7" Feb 27 01:13:02 crc kubenswrapper[4781]: I0227 01:13:02.653442 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqv4r\" (UniqueName: \"kubernetes.io/projected/d4577ec6-c8bb-4b50-912b-59bedb35c38b-kube-api-access-lqv4r\") pod \"certified-operators-sjrw7\" (UID: \"d4577ec6-c8bb-4b50-912b-59bedb35c38b\") " pod="openshift-marketplace/certified-operators-sjrw7" Feb 27 01:13:02 crc kubenswrapper[4781]: I0227 01:13:02.653968 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4577ec6-c8bb-4b50-912b-59bedb35c38b-utilities\") pod \"certified-operators-sjrw7\" (UID: \"d4577ec6-c8bb-4b50-912b-59bedb35c38b\") " pod="openshift-marketplace/certified-operators-sjrw7" Feb 27 01:13:02 crc kubenswrapper[4781]: I0227 01:13:02.654105 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4577ec6-c8bb-4b50-912b-59bedb35c38b-catalog-content\") pod \"certified-operators-sjrw7\" (UID: \"d4577ec6-c8bb-4b50-912b-59bedb35c38b\") " pod="openshift-marketplace/certified-operators-sjrw7" Feb 27 01:13:02 crc kubenswrapper[4781]: I0227 01:13:02.654550 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4577ec6-c8bb-4b50-912b-59bedb35c38b-utilities\") pod \"certified-operators-sjrw7\" (UID: \"d4577ec6-c8bb-4b50-912b-59bedb35c38b\") " pod="openshift-marketplace/certified-operators-sjrw7" Feb 27 01:13:02 crc kubenswrapper[4781]: I0227 01:13:02.654711 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4577ec6-c8bb-4b50-912b-59bedb35c38b-catalog-content\") pod \"certified-operators-sjrw7\" (UID: \"d4577ec6-c8bb-4b50-912b-59bedb35c38b\") " pod="openshift-marketplace/certified-operators-sjrw7" Feb 27 01:13:02 crc kubenswrapper[4781]: I0227 01:13:02.677649 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqv4r\" (UniqueName: \"kubernetes.io/projected/d4577ec6-c8bb-4b50-912b-59bedb35c38b-kube-api-access-lqv4r\") pod \"certified-operators-sjrw7\" (UID: \"d4577ec6-c8bb-4b50-912b-59bedb35c38b\") " pod="openshift-marketplace/certified-operators-sjrw7" Feb 27 01:13:02 crc kubenswrapper[4781]: I0227 01:13:02.716504 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sjrw7" Feb 27 01:13:03 crc kubenswrapper[4781]: I0227 01:13:03.308485 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sjrw7"] Feb 27 01:13:03 crc kubenswrapper[4781]: I0227 01:13:03.847898 4781 generic.go:334] "Generic (PLEG): container finished" podID="d4577ec6-c8bb-4b50-912b-59bedb35c38b" containerID="547e058f8b95f44d466aa7216dd85685b12a0e2d69dd613d2084d6e362d25b93" exitCode=0 Feb 27 01:13:03 crc kubenswrapper[4781]: I0227 01:13:03.847955 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjrw7" event={"ID":"d4577ec6-c8bb-4b50-912b-59bedb35c38b","Type":"ContainerDied","Data":"547e058f8b95f44d466aa7216dd85685b12a0e2d69dd613d2084d6e362d25b93"} Feb 27 01:13:03 crc kubenswrapper[4781]: I0227 01:13:03.848384 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjrw7" event={"ID":"d4577ec6-c8bb-4b50-912b-59bedb35c38b","Type":"ContainerStarted","Data":"7e0a68824449b3bff65757090cd8e9a85ee8a8d6a48a7613ac67a8bd344a423f"} Feb 27 01:13:04 crc kubenswrapper[4781]: I0227 01:13:04.859610 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjrw7" event={"ID":"d4577ec6-c8bb-4b50-912b-59bedb35c38b","Type":"ContainerStarted","Data":"ab24539bd64219aad8d6d754cd6bad7a9532d381150c4d7518b6ecdc63d439f9"} Feb 27 01:13:06 crc kubenswrapper[4781]: E0227 01:13:06.167039 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4577ec6_c8bb_4b50_912b_59bedb35c38b.slice/crio-conmon-ab24539bd64219aad8d6d754cd6bad7a9532d381150c4d7518b6ecdc63d439f9.scope\": RecentStats: unable to find data in memory cache]" Feb 27 01:13:06 crc kubenswrapper[4781]: I0227 01:13:06.877775 4781 generic.go:334] "Generic (PLEG): 
container finished" podID="d4577ec6-c8bb-4b50-912b-59bedb35c38b" containerID="ab24539bd64219aad8d6d754cd6bad7a9532d381150c4d7518b6ecdc63d439f9" exitCode=0 Feb 27 01:13:06 crc kubenswrapper[4781]: I0227 01:13:06.877970 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjrw7" event={"ID":"d4577ec6-c8bb-4b50-912b-59bedb35c38b","Type":"ContainerDied","Data":"ab24539bd64219aad8d6d754cd6bad7a9532d381150c4d7518b6ecdc63d439f9"} Feb 27 01:13:07 crc kubenswrapper[4781]: I0227 01:13:07.379458 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-574hs"] Feb 27 01:13:07 crc kubenswrapper[4781]: I0227 01:13:07.382068 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-574hs" Feb 27 01:13:07 crc kubenswrapper[4781]: I0227 01:13:07.393542 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-574hs"] Feb 27 01:13:07 crc kubenswrapper[4781]: I0227 01:13:07.556719 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9vnf\" (UniqueName: \"kubernetes.io/projected/57fe38df-608c-474e-b91f-4d744a0cb01f-kube-api-access-k9vnf\") pod \"community-operators-574hs\" (UID: \"57fe38df-608c-474e-b91f-4d744a0cb01f\") " pod="openshift-marketplace/community-operators-574hs" Feb 27 01:13:07 crc kubenswrapper[4781]: I0227 01:13:07.556892 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57fe38df-608c-474e-b91f-4d744a0cb01f-utilities\") pod \"community-operators-574hs\" (UID: \"57fe38df-608c-474e-b91f-4d744a0cb01f\") " pod="openshift-marketplace/community-operators-574hs" Feb 27 01:13:07 crc kubenswrapper[4781]: I0227 01:13:07.557353 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57fe38df-608c-474e-b91f-4d744a0cb01f-catalog-content\") pod \"community-operators-574hs\" (UID: \"57fe38df-608c-474e-b91f-4d744a0cb01f\") " pod="openshift-marketplace/community-operators-574hs" Feb 27 01:13:07 crc kubenswrapper[4781]: I0227 01:13:07.659515 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57fe38df-608c-474e-b91f-4d744a0cb01f-catalog-content\") pod \"community-operators-574hs\" (UID: \"57fe38df-608c-474e-b91f-4d744a0cb01f\") " pod="openshift-marketplace/community-operators-574hs" Feb 27 01:13:07 crc kubenswrapper[4781]: I0227 01:13:07.659942 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9vnf\" (UniqueName: \"kubernetes.io/projected/57fe38df-608c-474e-b91f-4d744a0cb01f-kube-api-access-k9vnf\") pod \"community-operators-574hs\" (UID: \"57fe38df-608c-474e-b91f-4d744a0cb01f\") " pod="openshift-marketplace/community-operators-574hs" Feb 27 01:13:07 crc kubenswrapper[4781]: I0227 01:13:07.660137 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57fe38df-608c-474e-b91f-4d744a0cb01f-utilities\") pod \"community-operators-574hs\" (UID: \"57fe38df-608c-474e-b91f-4d744a0cb01f\") " pod="openshift-marketplace/community-operators-574hs" Feb 27 01:13:07 crc kubenswrapper[4781]: I0227 01:13:07.660449 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57fe38df-608c-474e-b91f-4d744a0cb01f-catalog-content\") pod \"community-operators-574hs\" (UID: \"57fe38df-608c-474e-b91f-4d744a0cb01f\") " pod="openshift-marketplace/community-operators-574hs" Feb 27 01:13:07 crc kubenswrapper[4781]: I0227 01:13:07.660756 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/57fe38df-608c-474e-b91f-4d744a0cb01f-utilities\") pod \"community-operators-574hs\" (UID: \"57fe38df-608c-474e-b91f-4d744a0cb01f\") " pod="openshift-marketplace/community-operators-574hs" Feb 27 01:13:07 crc kubenswrapper[4781]: I0227 01:13:07.804199 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9vnf\" (UniqueName: \"kubernetes.io/projected/57fe38df-608c-474e-b91f-4d744a0cb01f-kube-api-access-k9vnf\") pod \"community-operators-574hs\" (UID: \"57fe38df-608c-474e-b91f-4d744a0cb01f\") " pod="openshift-marketplace/community-operators-574hs" Feb 27 01:13:08 crc kubenswrapper[4781]: I0227 01:13:08.022408 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-574hs" Feb 27 01:13:08 crc kubenswrapper[4781]: I0227 01:13:08.585063 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-574hs"] Feb 27 01:13:08 crc kubenswrapper[4781]: I0227 01:13:08.912286 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-574hs" event={"ID":"57fe38df-608c-474e-b91f-4d744a0cb01f","Type":"ContainerStarted","Data":"75a65897bd4f965053153949a05721c10fd94c8f33384c2d1703151fce3f8274"} Feb 27 01:13:08 crc kubenswrapper[4781]: I0227 01:13:08.912333 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-574hs" event={"ID":"57fe38df-608c-474e-b91f-4d744a0cb01f","Type":"ContainerStarted","Data":"9177c2a726ddba9b8c88cf36289009378b4b28e22fc492abae24f8038b4d1db8"} Feb 27 01:13:08 crc kubenswrapper[4781]: I0227 01:13:08.915455 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjrw7" event={"ID":"d4577ec6-c8bb-4b50-912b-59bedb35c38b","Type":"ContainerStarted","Data":"80df334ac1f9c8dd49c924a81663c4dba8dc136b8eafc805601011231e0bf6ea"} Feb 27 01:13:08 crc 
kubenswrapper[4781]: I0227 01:13:08.964406 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sjrw7" podStartSLOduration=3.489627225 podStartE2EDuration="6.964381578s" podCreationTimestamp="2026-02-27 01:13:02 +0000 UTC" firstStartedPulling="2026-02-27 01:13:03.850219199 +0000 UTC m=+4053.107758763" lastFinishedPulling="2026-02-27 01:13:07.324973562 +0000 UTC m=+4056.582513116" observedRunningTime="2026-02-27 01:13:08.955321955 +0000 UTC m=+4058.212861529" watchObservedRunningTime="2026-02-27 01:13:08.964381578 +0000 UTC m=+4058.221921132" Feb 27 01:13:09 crc kubenswrapper[4781]: I0227 01:13:09.929273 4781 generic.go:334] "Generic (PLEG): container finished" podID="57fe38df-608c-474e-b91f-4d744a0cb01f" containerID="75a65897bd4f965053153949a05721c10fd94c8f33384c2d1703151fce3f8274" exitCode=0 Feb 27 01:13:09 crc kubenswrapper[4781]: I0227 01:13:09.929341 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-574hs" event={"ID":"57fe38df-608c-474e-b91f-4d744a0cb01f","Type":"ContainerDied","Data":"75a65897bd4f965053153949a05721c10fd94c8f33384c2d1703151fce3f8274"} Feb 27 01:13:10 crc kubenswrapper[4781]: I0227 01:13:10.941513 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-574hs" event={"ID":"57fe38df-608c-474e-b91f-4d744a0cb01f","Type":"ContainerStarted","Data":"db7cd7a293b8cdf901eb3ed166d8d05e3554d7fc4c8a7af3788277aaaf6e6ad5"} Feb 27 01:13:12 crc kubenswrapper[4781]: I0227 01:13:12.717773 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sjrw7" Feb 27 01:13:12 crc kubenswrapper[4781]: I0227 01:13:12.718140 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sjrw7" Feb 27 01:13:12 crc kubenswrapper[4781]: I0227 01:13:12.768985 4781 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/certified-operators-sjrw7" Feb 27 01:13:13 crc kubenswrapper[4781]: I0227 01:13:13.971816 4781 generic.go:334] "Generic (PLEG): container finished" podID="57fe38df-608c-474e-b91f-4d744a0cb01f" containerID="db7cd7a293b8cdf901eb3ed166d8d05e3554d7fc4c8a7af3788277aaaf6e6ad5" exitCode=0 Feb 27 01:13:13 crc kubenswrapper[4781]: I0227 01:13:13.971897 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-574hs" event={"ID":"57fe38df-608c-474e-b91f-4d744a0cb01f","Type":"ContainerDied","Data":"db7cd7a293b8cdf901eb3ed166d8d05e3554d7fc4c8a7af3788277aaaf6e6ad5"} Feb 27 01:13:14 crc kubenswrapper[4781]: I0227 01:13:14.309495 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:13:14 crc kubenswrapper[4781]: E0227 01:13:14.309970 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:13:14 crc kubenswrapper[4781]: I0227 01:13:14.984048 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-574hs" event={"ID":"57fe38df-608c-474e-b91f-4d744a0cb01f","Type":"ContainerStarted","Data":"2f492675cc940a7aa817fc1bdc859aea396b1e5b4cf8474b07ad09176b41778e"} Feb 27 01:13:15 crc kubenswrapper[4781]: I0227 01:13:15.006437 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-574hs" podStartSLOduration=3.49755621 podStartE2EDuration="8.006420502s" podCreationTimestamp="2026-02-27 01:13:07 +0000 UTC" 
firstStartedPulling="2026-02-27 01:13:09.931434143 +0000 UTC m=+4059.188973697" lastFinishedPulling="2026-02-27 01:13:14.440298435 +0000 UTC m=+4063.697837989" observedRunningTime="2026-02-27 01:13:15.001257924 +0000 UTC m=+4064.258797488" watchObservedRunningTime="2026-02-27 01:13:15.006420502 +0000 UTC m=+4064.263960056" Feb 27 01:13:18 crc kubenswrapper[4781]: I0227 01:13:18.023606 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-574hs" Feb 27 01:13:18 crc kubenswrapper[4781]: I0227 01:13:18.024278 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-574hs" Feb 27 01:13:18 crc kubenswrapper[4781]: I0227 01:13:18.124803 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-574hs" Feb 27 01:13:19 crc kubenswrapper[4781]: I0227 01:13:19.071167 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-574hs" Feb 27 01:13:19 crc kubenswrapper[4781]: I0227 01:13:19.120412 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-574hs"] Feb 27 01:13:21 crc kubenswrapper[4781]: I0227 01:13:21.035911 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-574hs" podUID="57fe38df-608c-474e-b91f-4d744a0cb01f" containerName="registry-server" containerID="cri-o://2f492675cc940a7aa817fc1bdc859aea396b1e5b4cf8474b07ad09176b41778e" gracePeriod=2 Feb 27 01:13:21 crc kubenswrapper[4781]: I0227 01:13:21.824413 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-574hs" Feb 27 01:13:21 crc kubenswrapper[4781]: I0227 01:13:21.976136 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9vnf\" (UniqueName: \"kubernetes.io/projected/57fe38df-608c-474e-b91f-4d744a0cb01f-kube-api-access-k9vnf\") pod \"57fe38df-608c-474e-b91f-4d744a0cb01f\" (UID: \"57fe38df-608c-474e-b91f-4d744a0cb01f\") " Feb 27 01:13:21 crc kubenswrapper[4781]: I0227 01:13:21.976189 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57fe38df-608c-474e-b91f-4d744a0cb01f-utilities\") pod \"57fe38df-608c-474e-b91f-4d744a0cb01f\" (UID: \"57fe38df-608c-474e-b91f-4d744a0cb01f\") " Feb 27 01:13:21 crc kubenswrapper[4781]: I0227 01:13:21.977098 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57fe38df-608c-474e-b91f-4d744a0cb01f-utilities" (OuterVolumeSpecName: "utilities") pod "57fe38df-608c-474e-b91f-4d744a0cb01f" (UID: "57fe38df-608c-474e-b91f-4d744a0cb01f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:13:21 crc kubenswrapper[4781]: I0227 01:13:21.977359 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57fe38df-608c-474e-b91f-4d744a0cb01f-catalog-content\") pod \"57fe38df-608c-474e-b91f-4d744a0cb01f\" (UID: \"57fe38df-608c-474e-b91f-4d744a0cb01f\") " Feb 27 01:13:21 crc kubenswrapper[4781]: I0227 01:13:21.978571 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57fe38df-608c-474e-b91f-4d744a0cb01f-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:13:21 crc kubenswrapper[4781]: I0227 01:13:21.981709 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57fe38df-608c-474e-b91f-4d744a0cb01f-kube-api-access-k9vnf" (OuterVolumeSpecName: "kube-api-access-k9vnf") pod "57fe38df-608c-474e-b91f-4d744a0cb01f" (UID: "57fe38df-608c-474e-b91f-4d744a0cb01f"). InnerVolumeSpecName "kube-api-access-k9vnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.031070 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57fe38df-608c-474e-b91f-4d744a0cb01f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57fe38df-608c-474e-b91f-4d744a0cb01f" (UID: "57fe38df-608c-474e-b91f-4d744a0cb01f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.048714 4781 generic.go:334] "Generic (PLEG): container finished" podID="57fe38df-608c-474e-b91f-4d744a0cb01f" containerID="2f492675cc940a7aa817fc1bdc859aea396b1e5b4cf8474b07ad09176b41778e" exitCode=0 Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.048766 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-574hs" event={"ID":"57fe38df-608c-474e-b91f-4d744a0cb01f","Type":"ContainerDied","Data":"2f492675cc940a7aa817fc1bdc859aea396b1e5b4cf8474b07ad09176b41778e"} Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.048800 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-574hs" event={"ID":"57fe38df-608c-474e-b91f-4d744a0cb01f","Type":"ContainerDied","Data":"9177c2a726ddba9b8c88cf36289009378b4b28e22fc492abae24f8038b4d1db8"} Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.048816 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-574hs" Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.048823 4781 scope.go:117] "RemoveContainer" containerID="2f492675cc940a7aa817fc1bdc859aea396b1e5b4cf8474b07ad09176b41778e" Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.080679 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57fe38df-608c-474e-b91f-4d744a0cb01f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.080721 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9vnf\" (UniqueName: \"kubernetes.io/projected/57fe38df-608c-474e-b91f-4d744a0cb01f-kube-api-access-k9vnf\") on node \"crc\" DevicePath \"\"" Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.082028 4781 scope.go:117] "RemoveContainer" containerID="db7cd7a293b8cdf901eb3ed166d8d05e3554d7fc4c8a7af3788277aaaf6e6ad5" Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.090597 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-574hs"] Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.102029 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-574hs"] Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.114355 4781 scope.go:117] "RemoveContainer" containerID="75a65897bd4f965053153949a05721c10fd94c8f33384c2d1703151fce3f8274" Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.164446 4781 scope.go:117] "RemoveContainer" containerID="2f492675cc940a7aa817fc1bdc859aea396b1e5b4cf8474b07ad09176b41778e" Feb 27 01:13:22 crc kubenswrapper[4781]: E0227 01:13:22.166230 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f492675cc940a7aa817fc1bdc859aea396b1e5b4cf8474b07ad09176b41778e\": container with ID starting with 
2f492675cc940a7aa817fc1bdc859aea396b1e5b4cf8474b07ad09176b41778e not found: ID does not exist" containerID="2f492675cc940a7aa817fc1bdc859aea396b1e5b4cf8474b07ad09176b41778e" Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.166273 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f492675cc940a7aa817fc1bdc859aea396b1e5b4cf8474b07ad09176b41778e"} err="failed to get container status \"2f492675cc940a7aa817fc1bdc859aea396b1e5b4cf8474b07ad09176b41778e\": rpc error: code = NotFound desc = could not find container \"2f492675cc940a7aa817fc1bdc859aea396b1e5b4cf8474b07ad09176b41778e\": container with ID starting with 2f492675cc940a7aa817fc1bdc859aea396b1e5b4cf8474b07ad09176b41778e not found: ID does not exist" Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.166302 4781 scope.go:117] "RemoveContainer" containerID="db7cd7a293b8cdf901eb3ed166d8d05e3554d7fc4c8a7af3788277aaaf6e6ad5" Feb 27 01:13:22 crc kubenswrapper[4781]: E0227 01:13:22.166725 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db7cd7a293b8cdf901eb3ed166d8d05e3554d7fc4c8a7af3788277aaaf6e6ad5\": container with ID starting with db7cd7a293b8cdf901eb3ed166d8d05e3554d7fc4c8a7af3788277aaaf6e6ad5 not found: ID does not exist" containerID="db7cd7a293b8cdf901eb3ed166d8d05e3554d7fc4c8a7af3788277aaaf6e6ad5" Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.166834 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db7cd7a293b8cdf901eb3ed166d8d05e3554d7fc4c8a7af3788277aaaf6e6ad5"} err="failed to get container status \"db7cd7a293b8cdf901eb3ed166d8d05e3554d7fc4c8a7af3788277aaaf6e6ad5\": rpc error: code = NotFound desc = could not find container \"db7cd7a293b8cdf901eb3ed166d8d05e3554d7fc4c8a7af3788277aaaf6e6ad5\": container with ID starting with db7cd7a293b8cdf901eb3ed166d8d05e3554d7fc4c8a7af3788277aaaf6e6ad5 not found: ID does not 
exist" Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.166907 4781 scope.go:117] "RemoveContainer" containerID="75a65897bd4f965053153949a05721c10fd94c8f33384c2d1703151fce3f8274" Feb 27 01:13:22 crc kubenswrapper[4781]: E0227 01:13:22.167520 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75a65897bd4f965053153949a05721c10fd94c8f33384c2d1703151fce3f8274\": container with ID starting with 75a65897bd4f965053153949a05721c10fd94c8f33384c2d1703151fce3f8274 not found: ID does not exist" containerID="75a65897bd4f965053153949a05721c10fd94c8f33384c2d1703151fce3f8274" Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.167546 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75a65897bd4f965053153949a05721c10fd94c8f33384c2d1703151fce3f8274"} err="failed to get container status \"75a65897bd4f965053153949a05721c10fd94c8f33384c2d1703151fce3f8274\": rpc error: code = NotFound desc = could not find container \"75a65897bd4f965053153949a05721c10fd94c8f33384c2d1703151fce3f8274\": container with ID starting with 75a65897bd4f965053153949a05721c10fd94c8f33384c2d1703151fce3f8274 not found: ID does not exist" Feb 27 01:13:22 crc kubenswrapper[4781]: I0227 01:13:22.764305 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sjrw7" Feb 27 01:13:23 crc kubenswrapper[4781]: I0227 01:13:23.325672 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57fe38df-608c-474e-b91f-4d744a0cb01f" path="/var/lib/kubelet/pods/57fe38df-608c-474e-b91f-4d744a0cb01f/volumes" Feb 27 01:13:24 crc kubenswrapper[4781]: I0227 01:13:24.063195 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sjrw7"] Feb 27 01:13:24 crc kubenswrapper[4781]: I0227 01:13:24.063470 4781 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-sjrw7" podUID="d4577ec6-c8bb-4b50-912b-59bedb35c38b" containerName="registry-server" containerID="cri-o://80df334ac1f9c8dd49c924a81663c4dba8dc136b8eafc805601011231e0bf6ea" gracePeriod=2 Feb 27 01:13:24 crc kubenswrapper[4781]: I0227 01:13:24.779105 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sjrw7" Feb 27 01:13:24 crc kubenswrapper[4781]: I0227 01:13:24.940437 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4577ec6-c8bb-4b50-912b-59bedb35c38b-catalog-content\") pod \"d4577ec6-c8bb-4b50-912b-59bedb35c38b\" (UID: \"d4577ec6-c8bb-4b50-912b-59bedb35c38b\") " Feb 27 01:13:24 crc kubenswrapper[4781]: I0227 01:13:24.940521 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4577ec6-c8bb-4b50-912b-59bedb35c38b-utilities\") pod \"d4577ec6-c8bb-4b50-912b-59bedb35c38b\" (UID: \"d4577ec6-c8bb-4b50-912b-59bedb35c38b\") " Feb 27 01:13:24 crc kubenswrapper[4781]: I0227 01:13:24.940728 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqv4r\" (UniqueName: \"kubernetes.io/projected/d4577ec6-c8bb-4b50-912b-59bedb35c38b-kube-api-access-lqv4r\") pod \"d4577ec6-c8bb-4b50-912b-59bedb35c38b\" (UID: \"d4577ec6-c8bb-4b50-912b-59bedb35c38b\") " Feb 27 01:13:24 crc kubenswrapper[4781]: I0227 01:13:24.941405 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4577ec6-c8bb-4b50-912b-59bedb35c38b-utilities" (OuterVolumeSpecName: "utilities") pod "d4577ec6-c8bb-4b50-912b-59bedb35c38b" (UID: "d4577ec6-c8bb-4b50-912b-59bedb35c38b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:13:24 crc kubenswrapper[4781]: I0227 01:13:24.947272 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4577ec6-c8bb-4b50-912b-59bedb35c38b-kube-api-access-lqv4r" (OuterVolumeSpecName: "kube-api-access-lqv4r") pod "d4577ec6-c8bb-4b50-912b-59bedb35c38b" (UID: "d4577ec6-c8bb-4b50-912b-59bedb35c38b"). InnerVolumeSpecName "kube-api-access-lqv4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:13:24 crc kubenswrapper[4781]: I0227 01:13:24.996950 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4577ec6-c8bb-4b50-912b-59bedb35c38b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d4577ec6-c8bb-4b50-912b-59bedb35c38b" (UID: "d4577ec6-c8bb-4b50-912b-59bedb35c38b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.047350 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4577ec6-c8bb-4b50-912b-59bedb35c38b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.047408 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4577ec6-c8bb-4b50-912b-59bedb35c38b-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.047429 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqv4r\" (UniqueName: \"kubernetes.io/projected/d4577ec6-c8bb-4b50-912b-59bedb35c38b-kube-api-access-lqv4r\") on node \"crc\" DevicePath \"\"" Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.083180 4781 generic.go:334] "Generic (PLEG): container finished" podID="d4577ec6-c8bb-4b50-912b-59bedb35c38b" 
containerID="80df334ac1f9c8dd49c924a81663c4dba8dc136b8eafc805601011231e0bf6ea" exitCode=0 Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.083235 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjrw7" event={"ID":"d4577ec6-c8bb-4b50-912b-59bedb35c38b","Type":"ContainerDied","Data":"80df334ac1f9c8dd49c924a81663c4dba8dc136b8eafc805601011231e0bf6ea"} Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.083266 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjrw7" event={"ID":"d4577ec6-c8bb-4b50-912b-59bedb35c38b","Type":"ContainerDied","Data":"7e0a68824449b3bff65757090cd8e9a85ee8a8d6a48a7613ac67a8bd344a423f"} Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.083296 4781 scope.go:117] "RemoveContainer" containerID="80df334ac1f9c8dd49c924a81663c4dba8dc136b8eafc805601011231e0bf6ea" Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.083469 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sjrw7" Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.114101 4781 scope.go:117] "RemoveContainer" containerID="ab24539bd64219aad8d6d754cd6bad7a9532d381150c4d7518b6ecdc63d439f9" Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.137691 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sjrw7"] Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.142002 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sjrw7"] Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.158238 4781 scope.go:117] "RemoveContainer" containerID="547e058f8b95f44d466aa7216dd85685b12a0e2d69dd613d2084d6e362d25b93" Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.197640 4781 scope.go:117] "RemoveContainer" containerID="80df334ac1f9c8dd49c924a81663c4dba8dc136b8eafc805601011231e0bf6ea" Feb 27 01:13:25 crc kubenswrapper[4781]: E0227 01:13:25.198193 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80df334ac1f9c8dd49c924a81663c4dba8dc136b8eafc805601011231e0bf6ea\": container with ID starting with 80df334ac1f9c8dd49c924a81663c4dba8dc136b8eafc805601011231e0bf6ea not found: ID does not exist" containerID="80df334ac1f9c8dd49c924a81663c4dba8dc136b8eafc805601011231e0bf6ea" Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.198236 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80df334ac1f9c8dd49c924a81663c4dba8dc136b8eafc805601011231e0bf6ea"} err="failed to get container status \"80df334ac1f9c8dd49c924a81663c4dba8dc136b8eafc805601011231e0bf6ea\": rpc error: code = NotFound desc = could not find container \"80df334ac1f9c8dd49c924a81663c4dba8dc136b8eafc805601011231e0bf6ea\": container with ID starting with 80df334ac1f9c8dd49c924a81663c4dba8dc136b8eafc805601011231e0bf6ea not 
found: ID does not exist" Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.198264 4781 scope.go:117] "RemoveContainer" containerID="ab24539bd64219aad8d6d754cd6bad7a9532d381150c4d7518b6ecdc63d439f9" Feb 27 01:13:25 crc kubenswrapper[4781]: E0227 01:13:25.198573 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab24539bd64219aad8d6d754cd6bad7a9532d381150c4d7518b6ecdc63d439f9\": container with ID starting with ab24539bd64219aad8d6d754cd6bad7a9532d381150c4d7518b6ecdc63d439f9 not found: ID does not exist" containerID="ab24539bd64219aad8d6d754cd6bad7a9532d381150c4d7518b6ecdc63d439f9" Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.198600 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab24539bd64219aad8d6d754cd6bad7a9532d381150c4d7518b6ecdc63d439f9"} err="failed to get container status \"ab24539bd64219aad8d6d754cd6bad7a9532d381150c4d7518b6ecdc63d439f9\": rpc error: code = NotFound desc = could not find container \"ab24539bd64219aad8d6d754cd6bad7a9532d381150c4d7518b6ecdc63d439f9\": container with ID starting with ab24539bd64219aad8d6d754cd6bad7a9532d381150c4d7518b6ecdc63d439f9 not found: ID does not exist" Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.198614 4781 scope.go:117] "RemoveContainer" containerID="547e058f8b95f44d466aa7216dd85685b12a0e2d69dd613d2084d6e362d25b93" Feb 27 01:13:25 crc kubenswrapper[4781]: E0227 01:13:25.199009 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"547e058f8b95f44d466aa7216dd85685b12a0e2d69dd613d2084d6e362d25b93\": container with ID starting with 547e058f8b95f44d466aa7216dd85685b12a0e2d69dd613d2084d6e362d25b93 not found: ID does not exist" containerID="547e058f8b95f44d466aa7216dd85685b12a0e2d69dd613d2084d6e362d25b93" Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.199043 4781 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"547e058f8b95f44d466aa7216dd85685b12a0e2d69dd613d2084d6e362d25b93"} err="failed to get container status \"547e058f8b95f44d466aa7216dd85685b12a0e2d69dd613d2084d6e362d25b93\": rpc error: code = NotFound desc = could not find container \"547e058f8b95f44d466aa7216dd85685b12a0e2d69dd613d2084d6e362d25b93\": container with ID starting with 547e058f8b95f44d466aa7216dd85685b12a0e2d69dd613d2084d6e362d25b93 not found: ID does not exist" Feb 27 01:13:25 crc kubenswrapper[4781]: I0227 01:13:25.320333 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4577ec6-c8bb-4b50-912b-59bedb35c38b" path="/var/lib/kubelet/pods/d4577ec6-c8bb-4b50-912b-59bedb35c38b/volumes" Feb 27 01:13:28 crc kubenswrapper[4781]: I0227 01:13:28.309216 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:13:28 crc kubenswrapper[4781]: E0227 01:13:28.310065 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:13:41 crc kubenswrapper[4781]: I0227 01:13:41.316848 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:13:41 crc kubenswrapper[4781]: E0227 01:13:41.317525 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:13:52 crc kubenswrapper[4781]: I0227 01:13:52.310117 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:13:52 crc kubenswrapper[4781]: E0227 01:13:52.311175 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.177531 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535914-n4hsg"] Feb 27 01:14:00 crc kubenswrapper[4781]: E0227 01:14:00.178337 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57fe38df-608c-474e-b91f-4d744a0cb01f" containerName="registry-server" Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.178354 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="57fe38df-608c-474e-b91f-4d744a0cb01f" containerName="registry-server" Feb 27 01:14:00 crc kubenswrapper[4781]: E0227 01:14:00.178386 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4577ec6-c8bb-4b50-912b-59bedb35c38b" containerName="registry-server" Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.178394 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4577ec6-c8bb-4b50-912b-59bedb35c38b" containerName="registry-server" Feb 27 01:14:00 crc kubenswrapper[4781]: E0227 01:14:00.178410 4781 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d4577ec6-c8bb-4b50-912b-59bedb35c38b" containerName="extract-utilities" Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.178418 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4577ec6-c8bb-4b50-912b-59bedb35c38b" containerName="extract-utilities" Feb 27 01:14:00 crc kubenswrapper[4781]: E0227 01:14:00.178433 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57fe38df-608c-474e-b91f-4d744a0cb01f" containerName="extract-content" Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.178439 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="57fe38df-608c-474e-b91f-4d744a0cb01f" containerName="extract-content" Feb 27 01:14:00 crc kubenswrapper[4781]: E0227 01:14:00.178454 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57fe38df-608c-474e-b91f-4d744a0cb01f" containerName="extract-utilities" Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.178461 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="57fe38df-608c-474e-b91f-4d744a0cb01f" containerName="extract-utilities" Feb 27 01:14:00 crc kubenswrapper[4781]: E0227 01:14:00.178476 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4577ec6-c8bb-4b50-912b-59bedb35c38b" containerName="extract-content" Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.178483 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4577ec6-c8bb-4b50-912b-59bedb35c38b" containerName="extract-content" Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.178716 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4577ec6-c8bb-4b50-912b-59bedb35c38b" containerName="registry-server" Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.178735 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="57fe38df-608c-474e-b91f-4d744a0cb01f" containerName="registry-server" Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.180778 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535914-n4hsg" Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.187425 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.187542 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.188519 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.207310 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535914-n4hsg"] Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.279102 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzhz7\" (UniqueName: \"kubernetes.io/projected/2afc6c2c-4602-4819-bb62-46008ced90dc-kube-api-access-zzhz7\") pod \"auto-csr-approver-29535914-n4hsg\" (UID: \"2afc6c2c-4602-4819-bb62-46008ced90dc\") " pod="openshift-infra/auto-csr-approver-29535914-n4hsg" Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.381762 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzhz7\" (UniqueName: \"kubernetes.io/projected/2afc6c2c-4602-4819-bb62-46008ced90dc-kube-api-access-zzhz7\") pod \"auto-csr-approver-29535914-n4hsg\" (UID: \"2afc6c2c-4602-4819-bb62-46008ced90dc\") " pod="openshift-infra/auto-csr-approver-29535914-n4hsg" Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.402463 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzhz7\" (UniqueName: \"kubernetes.io/projected/2afc6c2c-4602-4819-bb62-46008ced90dc-kube-api-access-zzhz7\") pod \"auto-csr-approver-29535914-n4hsg\" (UID: \"2afc6c2c-4602-4819-bb62-46008ced90dc\") " 
pod="openshift-infra/auto-csr-approver-29535914-n4hsg" Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.511108 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535914-n4hsg" Feb 27 01:14:00 crc kubenswrapper[4781]: I0227 01:14:00.968652 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535914-n4hsg"] Feb 27 01:14:01 crc kubenswrapper[4781]: I0227 01:14:01.446413 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535914-n4hsg" event={"ID":"2afc6c2c-4602-4819-bb62-46008ced90dc","Type":"ContainerStarted","Data":"75a382a9d1c98e2e384eb6e738f7fc24a346a404652db42c44f0e5096954bfa9"} Feb 27 01:14:03 crc kubenswrapper[4781]: I0227 01:14:03.474171 4781 generic.go:334] "Generic (PLEG): container finished" podID="2afc6c2c-4602-4819-bb62-46008ced90dc" containerID="b2b6fac5723bb6bb5cfc84762685d87a6769151aad24d4f3926a5af565d7efe8" exitCode=0 Feb 27 01:14:03 crc kubenswrapper[4781]: I0227 01:14:03.474418 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535914-n4hsg" event={"ID":"2afc6c2c-4602-4819-bb62-46008ced90dc","Type":"ContainerDied","Data":"b2b6fac5723bb6bb5cfc84762685d87a6769151aad24d4f3926a5af565d7efe8"} Feb 27 01:14:04 crc kubenswrapper[4781]: I0227 01:14:04.310291 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:14:04 crc kubenswrapper[4781]: E0227 01:14:04.311035 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" 
Feb 27 01:14:05 crc kubenswrapper[4781]: I0227 01:14:05.101486 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535914-n4hsg" Feb 27 01:14:05 crc kubenswrapper[4781]: I0227 01:14:05.184849 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzhz7\" (UniqueName: \"kubernetes.io/projected/2afc6c2c-4602-4819-bb62-46008ced90dc-kube-api-access-zzhz7\") pod \"2afc6c2c-4602-4819-bb62-46008ced90dc\" (UID: \"2afc6c2c-4602-4819-bb62-46008ced90dc\") " Feb 27 01:14:05 crc kubenswrapper[4781]: I0227 01:14:05.191715 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2afc6c2c-4602-4819-bb62-46008ced90dc-kube-api-access-zzhz7" (OuterVolumeSpecName: "kube-api-access-zzhz7") pod "2afc6c2c-4602-4819-bb62-46008ced90dc" (UID: "2afc6c2c-4602-4819-bb62-46008ced90dc"). InnerVolumeSpecName "kube-api-access-zzhz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:14:05 crc kubenswrapper[4781]: I0227 01:14:05.287482 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzhz7\" (UniqueName: \"kubernetes.io/projected/2afc6c2c-4602-4819-bb62-46008ced90dc-kube-api-access-zzhz7\") on node \"crc\" DevicePath \"\"" Feb 27 01:14:05 crc kubenswrapper[4781]: I0227 01:14:05.496172 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535914-n4hsg" event={"ID":"2afc6c2c-4602-4819-bb62-46008ced90dc","Type":"ContainerDied","Data":"75a382a9d1c98e2e384eb6e738f7fc24a346a404652db42c44f0e5096954bfa9"} Feb 27 01:14:05 crc kubenswrapper[4781]: I0227 01:14:05.496215 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75a382a9d1c98e2e384eb6e738f7fc24a346a404652db42c44f0e5096954bfa9" Feb 27 01:14:05 crc kubenswrapper[4781]: I0227 01:14:05.496530 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535914-n4hsg" Feb 27 01:14:06 crc kubenswrapper[4781]: I0227 01:14:06.172151 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535908-lzshf"] Feb 27 01:14:06 crc kubenswrapper[4781]: I0227 01:14:06.183229 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535908-lzshf"] Feb 27 01:14:07 crc kubenswrapper[4781]: I0227 01:14:07.319900 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6cd4500-04f9-471d-8c27-2ce1b03fa4f0" path="/var/lib/kubelet/pods/f6cd4500-04f9-471d-8c27-2ce1b03fa4f0/volumes" Feb 27 01:14:16 crc kubenswrapper[4781]: I0227 01:14:16.309516 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:14:16 crc kubenswrapper[4781]: E0227 01:14:16.310344 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:14:22 crc kubenswrapper[4781]: I0227 01:14:22.673766 4781 generic.go:334] "Generic (PLEG): container finished" podID="2cc23bf5-7773-4d33-b2be-2ee2a807f086" containerID="4ec0cfbe0f662afc3fb53d5da9b369a462851688dbd0c754fa273d9d0f52d0e5" exitCode=0 Feb 27 01:14:22 crc kubenswrapper[4781]: I0227 01:14:22.673867 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"2cc23bf5-7773-4d33-b2be-2ee2a807f086","Type":"ContainerDied","Data":"4ec0cfbe0f662afc3fb53d5da9b369a462851688dbd0c754fa273d9d0f52d0e5"} Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.192081 4781 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.294663 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2cc23bf5-7773-4d33-b2be-2ee2a807f086-openstack-config\") pod \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.294749 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dj679\" (UniqueName: \"kubernetes.io/projected/2cc23bf5-7773-4d33-b2be-2ee2a807f086-kube-api-access-dj679\") pod \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.294912 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2cc23bf5-7773-4d33-b2be-2ee2a807f086-config-data\") pod \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.294998 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.295038 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cc23bf5-7773-4d33-b2be-2ee2a807f086-ssh-key\") pod \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.295064 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" 
(UniqueName: \"kubernetes.io/secret/2cc23bf5-7773-4d33-b2be-2ee2a807f086-ca-certs\") pod \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.295114 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/2cc23bf5-7773-4d33-b2be-2ee2a807f086-test-operator-ephemeral-workdir\") pod \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.295195 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/2cc23bf5-7773-4d33-b2be-2ee2a807f086-test-operator-ephemeral-temporary\") pod \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.295248 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2cc23bf5-7773-4d33-b2be-2ee2a807f086-openstack-config-secret\") pod \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\" (UID: \"2cc23bf5-7773-4d33-b2be-2ee2a807f086\") " Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.295946 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cc23bf5-7773-4d33-b2be-2ee2a807f086-config-data" (OuterVolumeSpecName: "config-data") pod "2cc23bf5-7773-4d33-b2be-2ee2a807f086" (UID: "2cc23bf5-7773-4d33-b2be-2ee2a807f086"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.296320 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cc23bf5-7773-4d33-b2be-2ee2a807f086-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "2cc23bf5-7773-4d33-b2be-2ee2a807f086" (UID: "2cc23bf5-7773-4d33-b2be-2ee2a807f086"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.302378 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cc23bf5-7773-4d33-b2be-2ee2a807f086-kube-api-access-dj679" (OuterVolumeSpecName: "kube-api-access-dj679") pod "2cc23bf5-7773-4d33-b2be-2ee2a807f086" (UID: "2cc23bf5-7773-4d33-b2be-2ee2a807f086"). InnerVolumeSpecName "kube-api-access-dj679". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.302857 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "test-operator-logs") pod "2cc23bf5-7773-4d33-b2be-2ee2a807f086" (UID: "2cc23bf5-7773-4d33-b2be-2ee2a807f086"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.329729 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cc23bf5-7773-4d33-b2be-2ee2a807f086-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "2cc23bf5-7773-4d33-b2be-2ee2a807f086" (UID: "2cc23bf5-7773-4d33-b2be-2ee2a807f086"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.333779 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cc23bf5-7773-4d33-b2be-2ee2a807f086-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2cc23bf5-7773-4d33-b2be-2ee2a807f086" (UID: "2cc23bf5-7773-4d33-b2be-2ee2a807f086"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.339580 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cc23bf5-7773-4d33-b2be-2ee2a807f086-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "2cc23bf5-7773-4d33-b2be-2ee2a807f086" (UID: "2cc23bf5-7773-4d33-b2be-2ee2a807f086"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.356610 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cc23bf5-7773-4d33-b2be-2ee2a807f086-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "2cc23bf5-7773-4d33-b2be-2ee2a807f086" (UID: "2cc23bf5-7773-4d33-b2be-2ee2a807f086"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.398731 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2cc23bf5-7773-4d33-b2be-2ee2a807f086-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.398779 4781 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.398792 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cc23bf5-7773-4d33-b2be-2ee2a807f086-ssh-key\") on node \"crc\" DevicePath \"\"" Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.398806 4781 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/2cc23bf5-7773-4d33-b2be-2ee2a807f086-ca-certs\") on node \"crc\" DevicePath \"\"" Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.398852 4781 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/2cc23bf5-7773-4d33-b2be-2ee2a807f086-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.398865 4781 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2cc23bf5-7773-4d33-b2be-2ee2a807f086-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.398878 4781 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2cc23bf5-7773-4d33-b2be-2ee2a807f086-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.398890 
4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dj679\" (UniqueName: \"kubernetes.io/projected/2cc23bf5-7773-4d33-b2be-2ee2a807f086-kube-api-access-dj679\") on node \"crc\" DevicePath \"\"" Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.426203 4781 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.502515 4781 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.695804 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"2cc23bf5-7773-4d33-b2be-2ee2a807f086","Type":"ContainerDied","Data":"09459f242ec2925373f69aa651b16dcc96301f46d456e4eb0a8a401a4473bde9"} Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.695857 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09459f242ec2925373f69aa651b16dcc96301f46d456e4eb0a8a401a4473bde9" Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.695928 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.713366 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cc23bf5-7773-4d33-b2be-2ee2a807f086-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "2cc23bf5-7773-4d33-b2be-2ee2a807f086" (UID: "2cc23bf5-7773-4d33-b2be-2ee2a807f086"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:14:24 crc kubenswrapper[4781]: I0227 01:14:24.807645 4781 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/2cc23bf5-7773-4d33-b2be-2ee2a807f086-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Feb 27 01:14:30 crc kubenswrapper[4781]: I0227 01:14:30.309425 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:14:30 crc kubenswrapper[4781]: E0227 01:14:30.310336 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:14:33 crc kubenswrapper[4781]: I0227 01:14:33.205652 4781 scope.go:117] "RemoveContainer" containerID="d86455502c8fe2209abff00ea2ac33cb262fd9455f065e8361be2d1baaf2ea79" Feb 27 01:14:36 crc kubenswrapper[4781]: I0227 01:14:36.752323 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 27 01:14:36 crc kubenswrapper[4781]: E0227 01:14:36.753436 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cc23bf5-7773-4d33-b2be-2ee2a807f086" containerName="tempest-tests-tempest-tests-runner" Feb 27 01:14:36 crc kubenswrapper[4781]: I0227 01:14:36.753450 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cc23bf5-7773-4d33-b2be-2ee2a807f086" containerName="tempest-tests-tempest-tests-runner" Feb 27 01:14:36 crc kubenswrapper[4781]: E0227 01:14:36.753477 4781 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2afc6c2c-4602-4819-bb62-46008ced90dc" containerName="oc" Feb 27 01:14:36 crc kubenswrapper[4781]: I0227 01:14:36.753483 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2afc6c2c-4602-4819-bb62-46008ced90dc" containerName="oc" Feb 27 01:14:36 crc kubenswrapper[4781]: I0227 01:14:36.753684 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2afc6c2c-4602-4819-bb62-46008ced90dc" containerName="oc" Feb 27 01:14:36 crc kubenswrapper[4781]: I0227 01:14:36.753706 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cc23bf5-7773-4d33-b2be-2ee2a807f086" containerName="tempest-tests-tempest-tests-runner" Feb 27 01:14:36 crc kubenswrapper[4781]: I0227 01:14:36.754450 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 27 01:14:36 crc kubenswrapper[4781]: I0227 01:14:36.757471 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-5s299" Feb 27 01:14:36 crc kubenswrapper[4781]: I0227 01:14:36.767608 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 27 01:14:36 crc kubenswrapper[4781]: I0227 01:14:36.786976 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"083b0010-19f4-4944-a097-96d20dad7eda\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 27 01:14:36 crc kubenswrapper[4781]: I0227 01:14:36.787082 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h82gl\" (UniqueName: \"kubernetes.io/projected/083b0010-19f4-4944-a097-96d20dad7eda-kube-api-access-h82gl\") pod 
\"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"083b0010-19f4-4944-a097-96d20dad7eda\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 27 01:14:36 crc kubenswrapper[4781]: I0227 01:14:36.888838 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"083b0010-19f4-4944-a097-96d20dad7eda\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 27 01:14:36 crc kubenswrapper[4781]: I0227 01:14:36.888984 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h82gl\" (UniqueName: \"kubernetes.io/projected/083b0010-19f4-4944-a097-96d20dad7eda-kube-api-access-h82gl\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"083b0010-19f4-4944-a097-96d20dad7eda\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 27 01:14:36 crc kubenswrapper[4781]: I0227 01:14:36.889438 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"083b0010-19f4-4944-a097-96d20dad7eda\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 27 01:14:36 crc kubenswrapper[4781]: I0227 01:14:36.909487 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h82gl\" (UniqueName: \"kubernetes.io/projected/083b0010-19f4-4944-a097-96d20dad7eda-kube-api-access-h82gl\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"083b0010-19f4-4944-a097-96d20dad7eda\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 27 01:14:36 crc kubenswrapper[4781]: I0227 01:14:36.917568 
4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"083b0010-19f4-4944-a097-96d20dad7eda\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 27 01:14:37 crc kubenswrapper[4781]: I0227 01:14:37.079330 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 27 01:14:37 crc kubenswrapper[4781]: I0227 01:14:37.525974 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 27 01:14:37 crc kubenswrapper[4781]: I0227 01:14:37.819674 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"083b0010-19f4-4944-a097-96d20dad7eda","Type":"ContainerStarted","Data":"5e5618976697cfd1be3a0195ffb7529857496cb0f3f7c03ed2932f110e6b36be"} Feb 27 01:14:41 crc kubenswrapper[4781]: I0227 01:14:41.856897 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"083b0010-19f4-4944-a097-96d20dad7eda","Type":"ContainerStarted","Data":"e12987251b06d55d729c13f02f8757c7e587543cbdff707df8570d08de533609"} Feb 27 01:14:41 crc kubenswrapper[4781]: I0227 01:14:41.875576 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.206598773 podStartE2EDuration="5.875549153s" podCreationTimestamp="2026-02-27 01:14:36 +0000 UTC" firstStartedPulling="2026-02-27 01:14:37.520258359 +0000 UTC m=+4146.777797933" lastFinishedPulling="2026-02-27 01:14:41.189208759 +0000 UTC m=+4150.446748313" observedRunningTime="2026-02-27 01:14:41.86757768 +0000 UTC m=+4151.125117234" 
watchObservedRunningTime="2026-02-27 01:14:41.875549153 +0000 UTC m=+4151.133088707" Feb 27 01:14:44 crc kubenswrapper[4781]: I0227 01:14:44.309852 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:14:44 crc kubenswrapper[4781]: E0227 01:14:44.310124 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:14:55 crc kubenswrapper[4781]: I0227 01:14:55.309868 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:14:55 crc kubenswrapper[4781]: E0227 01:14:55.310726 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:15:00 crc kubenswrapper[4781]: I0227 01:15:00.164020 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj"] Feb 27 01:15:00 crc kubenswrapper[4781]: I0227 01:15:00.165660 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj" Feb 27 01:15:00 crc kubenswrapper[4781]: I0227 01:15:00.169910 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 27 01:15:00 crc kubenswrapper[4781]: I0227 01:15:00.169910 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 27 01:15:00 crc kubenswrapper[4781]: I0227 01:15:00.190915 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj"] Feb 27 01:15:00 crc kubenswrapper[4781]: I0227 01:15:00.289322 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d1f1363d-33b3-4396-b176-66c221518e82-secret-volume\") pod \"collect-profiles-29535915-xmxrj\" (UID: \"d1f1363d-33b3-4396-b176-66c221518e82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj" Feb 27 01:15:00 crc kubenswrapper[4781]: I0227 01:15:00.289751 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4dhc\" (UniqueName: \"kubernetes.io/projected/d1f1363d-33b3-4396-b176-66c221518e82-kube-api-access-s4dhc\") pod \"collect-profiles-29535915-xmxrj\" (UID: \"d1f1363d-33b3-4396-b176-66c221518e82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj" Feb 27 01:15:00 crc kubenswrapper[4781]: I0227 01:15:00.289851 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1f1363d-33b3-4396-b176-66c221518e82-config-volume\") pod \"collect-profiles-29535915-xmxrj\" (UID: \"d1f1363d-33b3-4396-b176-66c221518e82\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj" Feb 27 01:15:00 crc kubenswrapper[4781]: I0227 01:15:00.391733 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1f1363d-33b3-4396-b176-66c221518e82-config-volume\") pod \"collect-profiles-29535915-xmxrj\" (UID: \"d1f1363d-33b3-4396-b176-66c221518e82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj" Feb 27 01:15:00 crc kubenswrapper[4781]: I0227 01:15:00.391898 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d1f1363d-33b3-4396-b176-66c221518e82-secret-volume\") pod \"collect-profiles-29535915-xmxrj\" (UID: \"d1f1363d-33b3-4396-b176-66c221518e82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj" Feb 27 01:15:00 crc kubenswrapper[4781]: I0227 01:15:00.392047 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4dhc\" (UniqueName: \"kubernetes.io/projected/d1f1363d-33b3-4396-b176-66c221518e82-kube-api-access-s4dhc\") pod \"collect-profiles-29535915-xmxrj\" (UID: \"d1f1363d-33b3-4396-b176-66c221518e82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj" Feb 27 01:15:00 crc kubenswrapper[4781]: I0227 01:15:00.393457 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1f1363d-33b3-4396-b176-66c221518e82-config-volume\") pod \"collect-profiles-29535915-xmxrj\" (UID: \"d1f1363d-33b3-4396-b176-66c221518e82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj" Feb 27 01:15:00 crc kubenswrapper[4781]: I0227 01:15:00.688152 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4dhc\" (UniqueName: 
\"kubernetes.io/projected/d1f1363d-33b3-4396-b176-66c221518e82-kube-api-access-s4dhc\") pod \"collect-profiles-29535915-xmxrj\" (UID: \"d1f1363d-33b3-4396-b176-66c221518e82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj" Feb 27 01:15:00 crc kubenswrapper[4781]: I0227 01:15:00.691063 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d1f1363d-33b3-4396-b176-66c221518e82-secret-volume\") pod \"collect-profiles-29535915-xmxrj\" (UID: \"d1f1363d-33b3-4396-b176-66c221518e82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj" Feb 27 01:15:00 crc kubenswrapper[4781]: I0227 01:15:00.803008 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj" Feb 27 01:15:01 crc kubenswrapper[4781]: I0227 01:15:01.297038 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj"] Feb 27 01:15:02 crc kubenswrapper[4781]: I0227 01:15:02.069910 4781 generic.go:334] "Generic (PLEG): container finished" podID="d1f1363d-33b3-4396-b176-66c221518e82" containerID="45c4167eed5398090c993c0790061c50a6dfea1582f588f9a80a8d848992fe77" exitCode=0 Feb 27 01:15:02 crc kubenswrapper[4781]: I0227 01:15:02.070004 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj" event={"ID":"d1f1363d-33b3-4396-b176-66c221518e82","Type":"ContainerDied","Data":"45c4167eed5398090c993c0790061c50a6dfea1582f588f9a80a8d848992fe77"} Feb 27 01:15:02 crc kubenswrapper[4781]: I0227 01:15:02.070269 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj" 
event={"ID":"d1f1363d-33b3-4396-b176-66c221518e82","Type":"ContainerStarted","Data":"399cdc1b5315fc83638e6cf2b70c66d1835b56844b28c229f41af4984e90d503"} Feb 27 01:15:03 crc kubenswrapper[4781]: I0227 01:15:03.645846 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj" Feb 27 01:15:03 crc kubenswrapper[4781]: I0227 01:15:03.770358 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1f1363d-33b3-4396-b176-66c221518e82-config-volume\") pod \"d1f1363d-33b3-4396-b176-66c221518e82\" (UID: \"d1f1363d-33b3-4396-b176-66c221518e82\") " Feb 27 01:15:03 crc kubenswrapper[4781]: I0227 01:15:03.770537 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d1f1363d-33b3-4396-b176-66c221518e82-secret-volume\") pod \"d1f1363d-33b3-4396-b176-66c221518e82\" (UID: \"d1f1363d-33b3-4396-b176-66c221518e82\") " Feb 27 01:15:03 crc kubenswrapper[4781]: I0227 01:15:03.770571 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4dhc\" (UniqueName: \"kubernetes.io/projected/d1f1363d-33b3-4396-b176-66c221518e82-kube-api-access-s4dhc\") pod \"d1f1363d-33b3-4396-b176-66c221518e82\" (UID: \"d1f1363d-33b3-4396-b176-66c221518e82\") " Feb 27 01:15:03 crc kubenswrapper[4781]: I0227 01:15:03.771127 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1f1363d-33b3-4396-b176-66c221518e82-config-volume" (OuterVolumeSpecName: "config-volume") pod "d1f1363d-33b3-4396-b176-66c221518e82" (UID: "d1f1363d-33b3-4396-b176-66c221518e82"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:15:03 crc kubenswrapper[4781]: I0227 01:15:03.776838 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1f1363d-33b3-4396-b176-66c221518e82-kube-api-access-s4dhc" (OuterVolumeSpecName: "kube-api-access-s4dhc") pod "d1f1363d-33b3-4396-b176-66c221518e82" (UID: "d1f1363d-33b3-4396-b176-66c221518e82"). InnerVolumeSpecName "kube-api-access-s4dhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:15:03 crc kubenswrapper[4781]: I0227 01:15:03.776880 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1f1363d-33b3-4396-b176-66c221518e82-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d1f1363d-33b3-4396-b176-66c221518e82" (UID: "d1f1363d-33b3-4396-b176-66c221518e82"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:15:03 crc kubenswrapper[4781]: I0227 01:15:03.872836 4781 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d1f1363d-33b3-4396-b176-66c221518e82-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 27 01:15:03 crc kubenswrapper[4781]: I0227 01:15:03.872887 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4dhc\" (UniqueName: \"kubernetes.io/projected/d1f1363d-33b3-4396-b176-66c221518e82-kube-api-access-s4dhc\") on node \"crc\" DevicePath \"\"" Feb 27 01:15:03 crc kubenswrapper[4781]: I0227 01:15:03.872901 4781 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1f1363d-33b3-4396-b176-66c221518e82-config-volume\") on node \"crc\" DevicePath \"\"" Feb 27 01:15:04 crc kubenswrapper[4781]: I0227 01:15:04.099193 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj" 
event={"ID":"d1f1363d-33b3-4396-b176-66c221518e82","Type":"ContainerDied","Data":"399cdc1b5315fc83638e6cf2b70c66d1835b56844b28c229f41af4984e90d503"} Feb 27 01:15:04 crc kubenswrapper[4781]: I0227 01:15:04.099235 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="399cdc1b5315fc83638e6cf2b70c66d1835b56844b28c229f41af4984e90d503" Feb 27 01:15:04 crc kubenswrapper[4781]: I0227 01:15:04.099516 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-xmxrj" Feb 27 01:15:04 crc kubenswrapper[4781]: I0227 01:15:04.745838 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl"] Feb 27 01:15:04 crc kubenswrapper[4781]: I0227 01:15:04.763311 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535870-kzqvl"] Feb 27 01:15:05 crc kubenswrapper[4781]: I0227 01:15:05.320865 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb78ed91-75d4-40d9-9359-da1c3878e145" path="/var/lib/kubelet/pods/eb78ed91-75d4-40d9-9359-da1c3878e145/volumes" Feb 27 01:15:08 crc kubenswrapper[4781]: I0227 01:15:08.311327 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:15:08 crc kubenswrapper[4781]: E0227 01:15:08.312118 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:15:22 crc kubenswrapper[4781]: I0227 01:15:22.309676 4781 scope.go:117] "RemoveContainer" 
containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:15:22 crc kubenswrapper[4781]: E0227 01:15:22.310569 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:15:33 crc kubenswrapper[4781]: I0227 01:15:33.310489 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:15:33 crc kubenswrapper[4781]: E0227 01:15:33.311644 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:15:33 crc kubenswrapper[4781]: I0227 01:15:33.318436 4781 scope.go:117] "RemoveContainer" containerID="d91a97b2a127dcb363e0a68bf8507e044d643d2c3b09f879675dfcd44d75afab" Feb 27 01:15:38 crc kubenswrapper[4781]: I0227 01:15:38.039766 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vvzsl/must-gather-b97zf"] Feb 27 01:15:38 crc kubenswrapper[4781]: E0227 01:15:38.040891 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1f1363d-33b3-4396-b176-66c221518e82" containerName="collect-profiles" Feb 27 01:15:38 crc kubenswrapper[4781]: I0227 01:15:38.040909 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1f1363d-33b3-4396-b176-66c221518e82" containerName="collect-profiles" Feb 27 
01:15:38 crc kubenswrapper[4781]: I0227 01:15:38.041153 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1f1363d-33b3-4396-b176-66c221518e82" containerName="collect-profiles" Feb 27 01:15:38 crc kubenswrapper[4781]: I0227 01:15:38.042465 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vvzsl/must-gather-b97zf" Feb 27 01:15:38 crc kubenswrapper[4781]: I0227 01:15:38.044488 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vvzsl"/"openshift-service-ca.crt" Feb 27 01:15:38 crc kubenswrapper[4781]: I0227 01:15:38.044541 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-vvzsl"/"default-dockercfg-pccr7" Feb 27 01:15:38 crc kubenswrapper[4781]: I0227 01:15:38.060692 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vvzsl"/"kube-root-ca.crt" Feb 27 01:15:38 crc kubenswrapper[4781]: I0227 01:15:38.061508 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vvzsl/must-gather-b97zf"] Feb 27 01:15:38 crc kubenswrapper[4781]: I0227 01:15:38.119078 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/03276b70-f5f8-486f-beb1-070a017efc66-must-gather-output\") pod \"must-gather-b97zf\" (UID: \"03276b70-f5f8-486f-beb1-070a017efc66\") " pod="openshift-must-gather-vvzsl/must-gather-b97zf" Feb 27 01:15:38 crc kubenswrapper[4781]: I0227 01:15:38.119463 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6k9r\" (UniqueName: \"kubernetes.io/projected/03276b70-f5f8-486f-beb1-070a017efc66-kube-api-access-q6k9r\") pod \"must-gather-b97zf\" (UID: \"03276b70-f5f8-486f-beb1-070a017efc66\") " pod="openshift-must-gather-vvzsl/must-gather-b97zf" Feb 27 01:15:38 crc kubenswrapper[4781]: I0227 
01:15:38.221929 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/03276b70-f5f8-486f-beb1-070a017efc66-must-gather-output\") pod \"must-gather-b97zf\" (UID: \"03276b70-f5f8-486f-beb1-070a017efc66\") " pod="openshift-must-gather-vvzsl/must-gather-b97zf" Feb 27 01:15:38 crc kubenswrapper[4781]: I0227 01:15:38.222015 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6k9r\" (UniqueName: \"kubernetes.io/projected/03276b70-f5f8-486f-beb1-070a017efc66-kube-api-access-q6k9r\") pod \"must-gather-b97zf\" (UID: \"03276b70-f5f8-486f-beb1-070a017efc66\") " pod="openshift-must-gather-vvzsl/must-gather-b97zf" Feb 27 01:15:38 crc kubenswrapper[4781]: I0227 01:15:38.222471 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/03276b70-f5f8-486f-beb1-070a017efc66-must-gather-output\") pod \"must-gather-b97zf\" (UID: \"03276b70-f5f8-486f-beb1-070a017efc66\") " pod="openshift-must-gather-vvzsl/must-gather-b97zf" Feb 27 01:15:38 crc kubenswrapper[4781]: I0227 01:15:38.247474 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6k9r\" (UniqueName: \"kubernetes.io/projected/03276b70-f5f8-486f-beb1-070a017efc66-kube-api-access-q6k9r\") pod \"must-gather-b97zf\" (UID: \"03276b70-f5f8-486f-beb1-070a017efc66\") " pod="openshift-must-gather-vvzsl/must-gather-b97zf" Feb 27 01:15:38 crc kubenswrapper[4781]: I0227 01:15:38.365493 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vvzsl/must-gather-b97zf" Feb 27 01:15:38 crc kubenswrapper[4781]: I0227 01:15:38.952688 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vvzsl/must-gather-b97zf"] Feb 27 01:15:39 crc kubenswrapper[4781]: I0227 01:15:39.471465 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vvzsl/must-gather-b97zf" event={"ID":"03276b70-f5f8-486f-beb1-070a017efc66","Type":"ContainerStarted","Data":"75213b32388c1fc11d660814864b8f23dbc7e620d603be8385e2ead2c1e70380"} Feb 27 01:15:46 crc kubenswrapper[4781]: I0227 01:15:46.311260 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:15:47 crc kubenswrapper[4781]: I0227 01:15:47.564684 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"93857194fe96d9ea4ad88dce6987b56ca3a1bbc406106d6f82950d6a036e6c83"} Feb 27 01:15:48 crc kubenswrapper[4781]: I0227 01:15:48.575923 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vvzsl/must-gather-b97zf" event={"ID":"03276b70-f5f8-486f-beb1-070a017efc66","Type":"ContainerStarted","Data":"7feeb6638ba83691635c80772fccbd29faea823d283b7f3acec5a9f856081234"} Feb 27 01:15:48 crc kubenswrapper[4781]: I0227 01:15:48.576267 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vvzsl/must-gather-b97zf" event={"ID":"03276b70-f5f8-486f-beb1-070a017efc66","Type":"ContainerStarted","Data":"8f029b28b3e1cd59a21aaf191717990342128ebaa1b5d8e4fe5967b88d656b81"} Feb 27 01:15:48 crc kubenswrapper[4781]: I0227 01:15:48.592948 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vvzsl/must-gather-b97zf" podStartSLOduration=2.197822789 podStartE2EDuration="10.592927406s" 
podCreationTimestamp="2026-02-27 01:15:38 +0000 UTC" firstStartedPulling="2026-02-27 01:15:38.959964289 +0000 UTC m=+4208.217503843" lastFinishedPulling="2026-02-27 01:15:47.355068906 +0000 UTC m=+4216.612608460" observedRunningTime="2026-02-27 01:15:48.590145442 +0000 UTC m=+4217.847684996" watchObservedRunningTime="2026-02-27 01:15:48.592927406 +0000 UTC m=+4217.850466960" Feb 27 01:15:52 crc kubenswrapper[4781]: I0227 01:15:52.799668 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vvzsl/crc-debug-ml4c2"] Feb 27 01:15:52 crc kubenswrapper[4781]: I0227 01:15:52.803140 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vvzsl/crc-debug-ml4c2" Feb 27 01:15:52 crc kubenswrapper[4781]: I0227 01:15:52.976125 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5c6w\" (UniqueName: \"kubernetes.io/projected/ec2ecbcb-d11e-4803-80d0-cda5c906849b-kube-api-access-c5c6w\") pod \"crc-debug-ml4c2\" (UID: \"ec2ecbcb-d11e-4803-80d0-cda5c906849b\") " pod="openshift-must-gather-vvzsl/crc-debug-ml4c2" Feb 27 01:15:52 crc kubenswrapper[4781]: I0227 01:15:52.976737 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec2ecbcb-d11e-4803-80d0-cda5c906849b-host\") pod \"crc-debug-ml4c2\" (UID: \"ec2ecbcb-d11e-4803-80d0-cda5c906849b\") " pod="openshift-must-gather-vvzsl/crc-debug-ml4c2" Feb 27 01:15:53 crc kubenswrapper[4781]: I0227 01:15:53.078913 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec2ecbcb-d11e-4803-80d0-cda5c906849b-host\") pod \"crc-debug-ml4c2\" (UID: \"ec2ecbcb-d11e-4803-80d0-cda5c906849b\") " pod="openshift-must-gather-vvzsl/crc-debug-ml4c2" Feb 27 01:15:53 crc kubenswrapper[4781]: I0227 01:15:53.079194 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec2ecbcb-d11e-4803-80d0-cda5c906849b-host\") pod \"crc-debug-ml4c2\" (UID: \"ec2ecbcb-d11e-4803-80d0-cda5c906849b\") " pod="openshift-must-gather-vvzsl/crc-debug-ml4c2" Feb 27 01:15:53 crc kubenswrapper[4781]: I0227 01:15:53.079777 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5c6w\" (UniqueName: \"kubernetes.io/projected/ec2ecbcb-d11e-4803-80d0-cda5c906849b-kube-api-access-c5c6w\") pod \"crc-debug-ml4c2\" (UID: \"ec2ecbcb-d11e-4803-80d0-cda5c906849b\") " pod="openshift-must-gather-vvzsl/crc-debug-ml4c2" Feb 27 01:15:53 crc kubenswrapper[4781]: I0227 01:15:53.478336 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5c6w\" (UniqueName: \"kubernetes.io/projected/ec2ecbcb-d11e-4803-80d0-cda5c906849b-kube-api-access-c5c6w\") pod \"crc-debug-ml4c2\" (UID: \"ec2ecbcb-d11e-4803-80d0-cda5c906849b\") " pod="openshift-must-gather-vvzsl/crc-debug-ml4c2" Feb 27 01:15:53 crc kubenswrapper[4781]: I0227 01:15:53.724570 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vvzsl/crc-debug-ml4c2" Feb 27 01:15:53 crc kubenswrapper[4781]: W0227 01:15:53.762155 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec2ecbcb_d11e_4803_80d0_cda5c906849b.slice/crio-3c19b7b6c89391495311645dc7fda7aa50a853b6c0af7a1674ece3c93ebb3511 WatchSource:0}: Error finding container 3c19b7b6c89391495311645dc7fda7aa50a853b6c0af7a1674ece3c93ebb3511: Status 404 returned error can't find the container with id 3c19b7b6c89391495311645dc7fda7aa50a853b6c0af7a1674ece3c93ebb3511 Feb 27 01:15:54 crc kubenswrapper[4781]: I0227 01:15:54.695888 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vvzsl/crc-debug-ml4c2" event={"ID":"ec2ecbcb-d11e-4803-80d0-cda5c906849b","Type":"ContainerStarted","Data":"3c19b7b6c89391495311645dc7fda7aa50a853b6c0af7a1674ece3c93ebb3511"} Feb 27 01:16:00 crc kubenswrapper[4781]: I0227 01:16:00.151000 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535916-5rslt"] Feb 27 01:16:00 crc kubenswrapper[4781]: I0227 01:16:00.153228 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535916-5rslt" Feb 27 01:16:00 crc kubenswrapper[4781]: I0227 01:16:00.156139 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:16:00 crc kubenswrapper[4781]: I0227 01:16:00.156250 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 01:16:00 crc kubenswrapper[4781]: I0227 01:16:00.156268 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:16:00 crc kubenswrapper[4781]: I0227 01:16:00.164454 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535916-5rslt"] Feb 27 01:16:00 crc kubenswrapper[4781]: I0227 01:16:00.249804 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rrph\" (UniqueName: \"kubernetes.io/projected/deca34b2-a27c-46b9-bbe3-ac2d08a7a72e-kube-api-access-2rrph\") pod \"auto-csr-approver-29535916-5rslt\" (UID: \"deca34b2-a27c-46b9-bbe3-ac2d08a7a72e\") " pod="openshift-infra/auto-csr-approver-29535916-5rslt" Feb 27 01:16:00 crc kubenswrapper[4781]: I0227 01:16:00.352558 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rrph\" (UniqueName: \"kubernetes.io/projected/deca34b2-a27c-46b9-bbe3-ac2d08a7a72e-kube-api-access-2rrph\") pod \"auto-csr-approver-29535916-5rslt\" (UID: \"deca34b2-a27c-46b9-bbe3-ac2d08a7a72e\") " pod="openshift-infra/auto-csr-approver-29535916-5rslt" Feb 27 01:16:00 crc kubenswrapper[4781]: I0227 01:16:00.374286 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rrph\" (UniqueName: \"kubernetes.io/projected/deca34b2-a27c-46b9-bbe3-ac2d08a7a72e-kube-api-access-2rrph\") pod \"auto-csr-approver-29535916-5rslt\" (UID: \"deca34b2-a27c-46b9-bbe3-ac2d08a7a72e\") " 
pod="openshift-infra/auto-csr-approver-29535916-5rslt" Feb 27 01:16:00 crc kubenswrapper[4781]: I0227 01:16:00.481157 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535916-5rslt" Feb 27 01:16:06 crc kubenswrapper[4781]: I0227 01:16:06.884234 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vvzsl/crc-debug-ml4c2" event={"ID":"ec2ecbcb-d11e-4803-80d0-cda5c906849b","Type":"ContainerStarted","Data":"5429009dce4ed7561680c8a6236f2fd38e0d3ba334a4b82f95acb92d3f8dce94"} Feb 27 01:16:06 crc kubenswrapper[4781]: I0227 01:16:06.907454 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vvzsl/crc-debug-ml4c2" podStartSLOduration=2.377190795 podStartE2EDuration="14.907433524s" podCreationTimestamp="2026-02-27 01:15:52 +0000 UTC" firstStartedPulling="2026-02-27 01:15:53.765280622 +0000 UTC m=+4223.022820176" lastFinishedPulling="2026-02-27 01:16:06.295523351 +0000 UTC m=+4235.553062905" observedRunningTime="2026-02-27 01:16:06.898559356 +0000 UTC m=+4236.156098910" watchObservedRunningTime="2026-02-27 01:16:06.907433524 +0000 UTC m=+4236.164973078" Feb 27 01:16:06 crc kubenswrapper[4781]: I0227 01:16:06.946603 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535916-5rslt"] Feb 27 01:16:06 crc kubenswrapper[4781]: W0227 01:16:06.947072 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddeca34b2_a27c_46b9_bbe3_ac2d08a7a72e.slice/crio-182ecfdc1953621dfcf56b119ca2c33c14ebd6a3b3896b349c359878020790a1 WatchSource:0}: Error finding container 182ecfdc1953621dfcf56b119ca2c33c14ebd6a3b3896b349c359878020790a1: Status 404 returned error can't find the container with id 182ecfdc1953621dfcf56b119ca2c33c14ebd6a3b3896b349c359878020790a1 Feb 27 01:16:07 crc kubenswrapper[4781]: I0227 01:16:07.898459 4781 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535916-5rslt" event={"ID":"deca34b2-a27c-46b9-bbe3-ac2d08a7a72e","Type":"ContainerStarted","Data":"182ecfdc1953621dfcf56b119ca2c33c14ebd6a3b3896b349c359878020790a1"} Feb 27 01:16:08 crc kubenswrapper[4781]: I0227 01:16:08.910139 4781 generic.go:334] "Generic (PLEG): container finished" podID="deca34b2-a27c-46b9-bbe3-ac2d08a7a72e" containerID="a89d93284b5be38596ce103c331565c9dbf5be828da69afb3c56f041c046abb6" exitCode=0 Feb 27 01:16:08 crc kubenswrapper[4781]: I0227 01:16:08.910245 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535916-5rslt" event={"ID":"deca34b2-a27c-46b9-bbe3-ac2d08a7a72e","Type":"ContainerDied","Data":"a89d93284b5be38596ce103c331565c9dbf5be828da69afb3c56f041c046abb6"} Feb 27 01:16:10 crc kubenswrapper[4781]: I0227 01:16:10.579204 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535916-5rslt" Feb 27 01:16:10 crc kubenswrapper[4781]: I0227 01:16:10.696303 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rrph\" (UniqueName: \"kubernetes.io/projected/deca34b2-a27c-46b9-bbe3-ac2d08a7a72e-kube-api-access-2rrph\") pod \"deca34b2-a27c-46b9-bbe3-ac2d08a7a72e\" (UID: \"deca34b2-a27c-46b9-bbe3-ac2d08a7a72e\") " Feb 27 01:16:10 crc kubenswrapper[4781]: I0227 01:16:10.703303 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deca34b2-a27c-46b9-bbe3-ac2d08a7a72e-kube-api-access-2rrph" (OuterVolumeSpecName: "kube-api-access-2rrph") pod "deca34b2-a27c-46b9-bbe3-ac2d08a7a72e" (UID: "deca34b2-a27c-46b9-bbe3-ac2d08a7a72e"). InnerVolumeSpecName "kube-api-access-2rrph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:16:10 crc kubenswrapper[4781]: I0227 01:16:10.799425 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rrph\" (UniqueName: \"kubernetes.io/projected/deca34b2-a27c-46b9-bbe3-ac2d08a7a72e-kube-api-access-2rrph\") on node \"crc\" DevicePath \"\"" Feb 27 01:16:10 crc kubenswrapper[4781]: I0227 01:16:10.935327 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535916-5rslt" event={"ID":"deca34b2-a27c-46b9-bbe3-ac2d08a7a72e","Type":"ContainerDied","Data":"182ecfdc1953621dfcf56b119ca2c33c14ebd6a3b3896b349c359878020790a1"} Feb 27 01:16:10 crc kubenswrapper[4781]: I0227 01:16:10.935708 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="182ecfdc1953621dfcf56b119ca2c33c14ebd6a3b3896b349c359878020790a1" Feb 27 01:16:10 crc kubenswrapper[4781]: I0227 01:16:10.935384 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535916-5rslt" Feb 27 01:16:11 crc kubenswrapper[4781]: I0227 01:16:11.665241 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535910-zxgrs"] Feb 27 01:16:11 crc kubenswrapper[4781]: I0227 01:16:11.678441 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535910-zxgrs"] Feb 27 01:16:13 crc kubenswrapper[4781]: I0227 01:16:13.324976 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5688def-e560-413e-8be5-1d2cfd7e7b4b" path="/var/lib/kubelet/pods/b5688def-e560-413e-8be5-1d2cfd7e7b4b/volumes" Feb 27 01:16:33 crc kubenswrapper[4781]: I0227 01:16:33.411475 4781 scope.go:117] "RemoveContainer" containerID="2f2421387f96858e89c61569a502259afb51c7ee81cb327e3f4310b20461360e" Feb 27 01:17:04 crc kubenswrapper[4781]: I0227 01:17:04.471015 4781 generic.go:334] "Generic (PLEG): container finished" 
podID="ec2ecbcb-d11e-4803-80d0-cda5c906849b" containerID="5429009dce4ed7561680c8a6236f2fd38e0d3ba334a4b82f95acb92d3f8dce94" exitCode=0 Feb 27 01:17:04 crc kubenswrapper[4781]: I0227 01:17:04.471105 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vvzsl/crc-debug-ml4c2" event={"ID":"ec2ecbcb-d11e-4803-80d0-cda5c906849b","Type":"ContainerDied","Data":"5429009dce4ed7561680c8a6236f2fd38e0d3ba334a4b82f95acb92d3f8dce94"} Feb 27 01:17:05 crc kubenswrapper[4781]: I0227 01:17:05.605201 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vvzsl/crc-debug-ml4c2" Feb 27 01:17:05 crc kubenswrapper[4781]: I0227 01:17:05.641653 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vvzsl/crc-debug-ml4c2"] Feb 27 01:17:05 crc kubenswrapper[4781]: I0227 01:17:05.669845 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vvzsl/crc-debug-ml4c2"] Feb 27 01:17:05 crc kubenswrapper[4781]: I0227 01:17:05.777664 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5c6w\" (UniqueName: \"kubernetes.io/projected/ec2ecbcb-d11e-4803-80d0-cda5c906849b-kube-api-access-c5c6w\") pod \"ec2ecbcb-d11e-4803-80d0-cda5c906849b\" (UID: \"ec2ecbcb-d11e-4803-80d0-cda5c906849b\") " Feb 27 01:17:05 crc kubenswrapper[4781]: I0227 01:17:05.778785 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec2ecbcb-d11e-4803-80d0-cda5c906849b-host\") pod \"ec2ecbcb-d11e-4803-80d0-cda5c906849b\" (UID: \"ec2ecbcb-d11e-4803-80d0-cda5c906849b\") " Feb 27 01:17:05 crc kubenswrapper[4781]: I0227 01:17:05.778933 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec2ecbcb-d11e-4803-80d0-cda5c906849b-host" (OuterVolumeSpecName: "host") pod "ec2ecbcb-d11e-4803-80d0-cda5c906849b" (UID: 
"ec2ecbcb-d11e-4803-80d0-cda5c906849b"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 01:17:05 crc kubenswrapper[4781]: I0227 01:17:05.779551 4781 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec2ecbcb-d11e-4803-80d0-cda5c906849b-host\") on node \"crc\" DevicePath \"\"" Feb 27 01:17:06 crc kubenswrapper[4781]: I0227 01:17:06.475857 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec2ecbcb-d11e-4803-80d0-cda5c906849b-kube-api-access-c5c6w" (OuterVolumeSpecName: "kube-api-access-c5c6w") pod "ec2ecbcb-d11e-4803-80d0-cda5c906849b" (UID: "ec2ecbcb-d11e-4803-80d0-cda5c906849b"). InnerVolumeSpecName "kube-api-access-c5c6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:17:06 crc kubenswrapper[4781]: I0227 01:17:06.497142 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5c6w\" (UniqueName: \"kubernetes.io/projected/ec2ecbcb-d11e-4803-80d0-cda5c906849b-kube-api-access-c5c6w\") on node \"crc\" DevicePath \"\"" Feb 27 01:17:06 crc kubenswrapper[4781]: I0227 01:17:06.502716 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c19b7b6c89391495311645dc7fda7aa50a853b6c0af7a1674ece3c93ebb3511" Feb 27 01:17:06 crc kubenswrapper[4781]: I0227 01:17:06.502835 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vvzsl/crc-debug-ml4c2" Feb 27 01:17:06 crc kubenswrapper[4781]: I0227 01:17:06.807691 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vvzsl/crc-debug-nbfnc"] Feb 27 01:17:06 crc kubenswrapper[4781]: E0227 01:17:06.808147 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deca34b2-a27c-46b9-bbe3-ac2d08a7a72e" containerName="oc" Feb 27 01:17:06 crc kubenswrapper[4781]: I0227 01:17:06.808166 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="deca34b2-a27c-46b9-bbe3-ac2d08a7a72e" containerName="oc" Feb 27 01:17:06 crc kubenswrapper[4781]: E0227 01:17:06.808191 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec2ecbcb-d11e-4803-80d0-cda5c906849b" containerName="container-00" Feb 27 01:17:06 crc kubenswrapper[4781]: I0227 01:17:06.808202 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec2ecbcb-d11e-4803-80d0-cda5c906849b" containerName="container-00" Feb 27 01:17:06 crc kubenswrapper[4781]: I0227 01:17:06.808388 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="deca34b2-a27c-46b9-bbe3-ac2d08a7a72e" containerName="oc" Feb 27 01:17:06 crc kubenswrapper[4781]: I0227 01:17:06.808403 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec2ecbcb-d11e-4803-80d0-cda5c906849b" containerName="container-00" Feb 27 01:17:06 crc kubenswrapper[4781]: I0227 01:17:06.809133 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vvzsl/crc-debug-nbfnc" Feb 27 01:17:06 crc kubenswrapper[4781]: I0227 01:17:06.905686 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef5ae2b7-86af-4272-9ec7-767cfa31836a-host\") pod \"crc-debug-nbfnc\" (UID: \"ef5ae2b7-86af-4272-9ec7-767cfa31836a\") " pod="openshift-must-gather-vvzsl/crc-debug-nbfnc" Feb 27 01:17:06 crc kubenswrapper[4781]: I0227 01:17:06.905824 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zfk7\" (UniqueName: \"kubernetes.io/projected/ef5ae2b7-86af-4272-9ec7-767cfa31836a-kube-api-access-4zfk7\") pod \"crc-debug-nbfnc\" (UID: \"ef5ae2b7-86af-4272-9ec7-767cfa31836a\") " pod="openshift-must-gather-vvzsl/crc-debug-nbfnc" Feb 27 01:17:07 crc kubenswrapper[4781]: I0227 01:17:07.008684 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef5ae2b7-86af-4272-9ec7-767cfa31836a-host\") pod \"crc-debug-nbfnc\" (UID: \"ef5ae2b7-86af-4272-9ec7-767cfa31836a\") " pod="openshift-must-gather-vvzsl/crc-debug-nbfnc" Feb 27 01:17:07 crc kubenswrapper[4781]: I0227 01:17:07.008790 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zfk7\" (UniqueName: \"kubernetes.io/projected/ef5ae2b7-86af-4272-9ec7-767cfa31836a-kube-api-access-4zfk7\") pod \"crc-debug-nbfnc\" (UID: \"ef5ae2b7-86af-4272-9ec7-767cfa31836a\") " pod="openshift-must-gather-vvzsl/crc-debug-nbfnc" Feb 27 01:17:07 crc kubenswrapper[4781]: I0227 01:17:07.008878 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef5ae2b7-86af-4272-9ec7-767cfa31836a-host\") pod \"crc-debug-nbfnc\" (UID: \"ef5ae2b7-86af-4272-9ec7-767cfa31836a\") " pod="openshift-must-gather-vvzsl/crc-debug-nbfnc" Feb 27 01:17:07 crc 
kubenswrapper[4781]: I0227 01:17:07.030382 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zfk7\" (UniqueName: \"kubernetes.io/projected/ef5ae2b7-86af-4272-9ec7-767cfa31836a-kube-api-access-4zfk7\") pod \"crc-debug-nbfnc\" (UID: \"ef5ae2b7-86af-4272-9ec7-767cfa31836a\") " pod="openshift-must-gather-vvzsl/crc-debug-nbfnc" Feb 27 01:17:07 crc kubenswrapper[4781]: I0227 01:17:07.126926 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vvzsl/crc-debug-nbfnc" Feb 27 01:17:07 crc kubenswrapper[4781]: I0227 01:17:07.321189 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec2ecbcb-d11e-4803-80d0-cda5c906849b" path="/var/lib/kubelet/pods/ec2ecbcb-d11e-4803-80d0-cda5c906849b/volumes" Feb 27 01:17:07 crc kubenswrapper[4781]: I0227 01:17:07.513604 4781 generic.go:334] "Generic (PLEG): container finished" podID="ef5ae2b7-86af-4272-9ec7-767cfa31836a" containerID="4d92a51020b7596d7680b0bc9dcf1180dddc52a240e9d1a8d518dcb39bbffd84" exitCode=0 Feb 27 01:17:07 crc kubenswrapper[4781]: I0227 01:17:07.513660 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vvzsl/crc-debug-nbfnc" event={"ID":"ef5ae2b7-86af-4272-9ec7-767cfa31836a","Type":"ContainerDied","Data":"4d92a51020b7596d7680b0bc9dcf1180dddc52a240e9d1a8d518dcb39bbffd84"} Feb 27 01:17:07 crc kubenswrapper[4781]: I0227 01:17:07.513687 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vvzsl/crc-debug-nbfnc" event={"ID":"ef5ae2b7-86af-4272-9ec7-767cfa31836a","Type":"ContainerStarted","Data":"72c5d34b8aad92c4bd90c23fcd563898e50ffb0bd14489ced455f6756ad964e7"} Feb 27 01:17:08 crc kubenswrapper[4781]: I0227 01:17:08.649930 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vvzsl/crc-debug-nbfnc" Feb 27 01:17:08 crc kubenswrapper[4781]: I0227 01:17:08.737489 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zfk7\" (UniqueName: \"kubernetes.io/projected/ef5ae2b7-86af-4272-9ec7-767cfa31836a-kube-api-access-4zfk7\") pod \"ef5ae2b7-86af-4272-9ec7-767cfa31836a\" (UID: \"ef5ae2b7-86af-4272-9ec7-767cfa31836a\") " Feb 27 01:17:08 crc kubenswrapper[4781]: I0227 01:17:08.738053 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef5ae2b7-86af-4272-9ec7-767cfa31836a-host\") pod \"ef5ae2b7-86af-4272-9ec7-767cfa31836a\" (UID: \"ef5ae2b7-86af-4272-9ec7-767cfa31836a\") " Feb 27 01:17:08 crc kubenswrapper[4781]: I0227 01:17:08.738169 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef5ae2b7-86af-4272-9ec7-767cfa31836a-host" (OuterVolumeSpecName: "host") pod "ef5ae2b7-86af-4272-9ec7-767cfa31836a" (UID: "ef5ae2b7-86af-4272-9ec7-767cfa31836a"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 01:17:08 crc kubenswrapper[4781]: I0227 01:17:08.738963 4781 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef5ae2b7-86af-4272-9ec7-767cfa31836a-host\") on node \"crc\" DevicePath \"\"" Feb 27 01:17:08 crc kubenswrapper[4781]: I0227 01:17:08.745610 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef5ae2b7-86af-4272-9ec7-767cfa31836a-kube-api-access-4zfk7" (OuterVolumeSpecName: "kube-api-access-4zfk7") pod "ef5ae2b7-86af-4272-9ec7-767cfa31836a" (UID: "ef5ae2b7-86af-4272-9ec7-767cfa31836a"). InnerVolumeSpecName "kube-api-access-4zfk7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:17:08 crc kubenswrapper[4781]: I0227 01:17:08.840215 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zfk7\" (UniqueName: \"kubernetes.io/projected/ef5ae2b7-86af-4272-9ec7-767cfa31836a-kube-api-access-4zfk7\") on node \"crc\" DevicePath \"\"" Feb 27 01:17:08 crc kubenswrapper[4781]: I0227 01:17:08.984429 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vvzsl/crc-debug-nbfnc"] Feb 27 01:17:09 crc kubenswrapper[4781]: I0227 01:17:09.012026 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vvzsl/crc-debug-nbfnc"] Feb 27 01:17:09 crc kubenswrapper[4781]: I0227 01:17:09.321772 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef5ae2b7-86af-4272-9ec7-767cfa31836a" path="/var/lib/kubelet/pods/ef5ae2b7-86af-4272-9ec7-767cfa31836a/volumes" Feb 27 01:17:09 crc kubenswrapper[4781]: I0227 01:17:09.534380 4781 scope.go:117] "RemoveContainer" containerID="4d92a51020b7596d7680b0bc9dcf1180dddc52a240e9d1a8d518dcb39bbffd84" Feb 27 01:17:09 crc kubenswrapper[4781]: I0227 01:17:09.534427 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vvzsl/crc-debug-nbfnc" Feb 27 01:17:10 crc kubenswrapper[4781]: I0227 01:17:10.169999 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vvzsl/crc-debug-8tfrx"] Feb 27 01:17:10 crc kubenswrapper[4781]: E0227 01:17:10.171196 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef5ae2b7-86af-4272-9ec7-767cfa31836a" containerName="container-00" Feb 27 01:17:10 crc kubenswrapper[4781]: I0227 01:17:10.171214 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef5ae2b7-86af-4272-9ec7-767cfa31836a" containerName="container-00" Feb 27 01:17:10 crc kubenswrapper[4781]: I0227 01:17:10.171510 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef5ae2b7-86af-4272-9ec7-767cfa31836a" containerName="container-00" Feb 27 01:17:10 crc kubenswrapper[4781]: I0227 01:17:10.172414 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vvzsl/crc-debug-8tfrx" Feb 27 01:17:10 crc kubenswrapper[4781]: I0227 01:17:10.273220 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57gj7\" (UniqueName: \"kubernetes.io/projected/ba840bdf-362a-4cad-85e5-3f450bd7f2f5-kube-api-access-57gj7\") pod \"crc-debug-8tfrx\" (UID: \"ba840bdf-362a-4cad-85e5-3f450bd7f2f5\") " pod="openshift-must-gather-vvzsl/crc-debug-8tfrx" Feb 27 01:17:10 crc kubenswrapper[4781]: I0227 01:17:10.273308 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ba840bdf-362a-4cad-85e5-3f450bd7f2f5-host\") pod \"crc-debug-8tfrx\" (UID: \"ba840bdf-362a-4cad-85e5-3f450bd7f2f5\") " pod="openshift-must-gather-vvzsl/crc-debug-8tfrx" Feb 27 01:17:10 crc kubenswrapper[4781]: I0227 01:17:10.375354 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57gj7\" (UniqueName: 
\"kubernetes.io/projected/ba840bdf-362a-4cad-85e5-3f450bd7f2f5-kube-api-access-57gj7\") pod \"crc-debug-8tfrx\" (UID: \"ba840bdf-362a-4cad-85e5-3f450bd7f2f5\") " pod="openshift-must-gather-vvzsl/crc-debug-8tfrx" Feb 27 01:17:10 crc kubenswrapper[4781]: I0227 01:17:10.375437 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ba840bdf-362a-4cad-85e5-3f450bd7f2f5-host\") pod \"crc-debug-8tfrx\" (UID: \"ba840bdf-362a-4cad-85e5-3f450bd7f2f5\") " pod="openshift-must-gather-vvzsl/crc-debug-8tfrx" Feb 27 01:17:10 crc kubenswrapper[4781]: I0227 01:17:10.375681 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ba840bdf-362a-4cad-85e5-3f450bd7f2f5-host\") pod \"crc-debug-8tfrx\" (UID: \"ba840bdf-362a-4cad-85e5-3f450bd7f2f5\") " pod="openshift-must-gather-vvzsl/crc-debug-8tfrx" Feb 27 01:17:10 crc kubenswrapper[4781]: I0227 01:17:10.397774 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57gj7\" (UniqueName: \"kubernetes.io/projected/ba840bdf-362a-4cad-85e5-3f450bd7f2f5-kube-api-access-57gj7\") pod \"crc-debug-8tfrx\" (UID: \"ba840bdf-362a-4cad-85e5-3f450bd7f2f5\") " pod="openshift-must-gather-vvzsl/crc-debug-8tfrx" Feb 27 01:17:10 crc kubenswrapper[4781]: I0227 01:17:10.494059 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vvzsl/crc-debug-8tfrx" Feb 27 01:17:10 crc kubenswrapper[4781]: W0227 01:17:10.543900 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba840bdf_362a_4cad_85e5_3f450bd7f2f5.slice/crio-1506348ce5416450a0d38efaeef159358e8fab0ed16fcde55a882c178f0e269a WatchSource:0}: Error finding container 1506348ce5416450a0d38efaeef159358e8fab0ed16fcde55a882c178f0e269a: Status 404 returned error can't find the container with id 1506348ce5416450a0d38efaeef159358e8fab0ed16fcde55a882c178f0e269a Feb 27 01:17:11 crc kubenswrapper[4781]: I0227 01:17:11.565442 4781 generic.go:334] "Generic (PLEG): container finished" podID="ba840bdf-362a-4cad-85e5-3f450bd7f2f5" containerID="aa2a575779b6b05c095bd940a68ea17e213f83823d4e072ee160e36b9bfd3fea" exitCode=0 Feb 27 01:17:11 crc kubenswrapper[4781]: I0227 01:17:11.566082 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vvzsl/crc-debug-8tfrx" event={"ID":"ba840bdf-362a-4cad-85e5-3f450bd7f2f5","Type":"ContainerDied","Data":"aa2a575779b6b05c095bd940a68ea17e213f83823d4e072ee160e36b9bfd3fea"} Feb 27 01:17:11 crc kubenswrapper[4781]: I0227 01:17:11.566139 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vvzsl/crc-debug-8tfrx" event={"ID":"ba840bdf-362a-4cad-85e5-3f450bd7f2f5","Type":"ContainerStarted","Data":"1506348ce5416450a0d38efaeef159358e8fab0ed16fcde55a882c178f0e269a"} Feb 27 01:17:11 crc kubenswrapper[4781]: I0227 01:17:11.612481 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vvzsl/crc-debug-8tfrx"] Feb 27 01:17:11 crc kubenswrapper[4781]: I0227 01:17:11.622479 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vvzsl/crc-debug-8tfrx"] Feb 27 01:17:12 crc kubenswrapper[4781]: I0227 01:17:12.687558 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vvzsl/crc-debug-8tfrx" Feb 27 01:17:12 crc kubenswrapper[4781]: I0227 01:17:12.826845 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ba840bdf-362a-4cad-85e5-3f450bd7f2f5-host\") pod \"ba840bdf-362a-4cad-85e5-3f450bd7f2f5\" (UID: \"ba840bdf-362a-4cad-85e5-3f450bd7f2f5\") " Feb 27 01:17:12 crc kubenswrapper[4781]: I0227 01:17:12.826944 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57gj7\" (UniqueName: \"kubernetes.io/projected/ba840bdf-362a-4cad-85e5-3f450bd7f2f5-kube-api-access-57gj7\") pod \"ba840bdf-362a-4cad-85e5-3f450bd7f2f5\" (UID: \"ba840bdf-362a-4cad-85e5-3f450bd7f2f5\") " Feb 27 01:17:12 crc kubenswrapper[4781]: I0227 01:17:12.826956 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ba840bdf-362a-4cad-85e5-3f450bd7f2f5-host" (OuterVolumeSpecName: "host") pod "ba840bdf-362a-4cad-85e5-3f450bd7f2f5" (UID: "ba840bdf-362a-4cad-85e5-3f450bd7f2f5"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 01:17:12 crc kubenswrapper[4781]: I0227 01:17:12.827521 4781 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ba840bdf-362a-4cad-85e5-3f450bd7f2f5-host\") on node \"crc\" DevicePath \"\"" Feb 27 01:17:12 crc kubenswrapper[4781]: I0227 01:17:12.843868 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba840bdf-362a-4cad-85e5-3f450bd7f2f5-kube-api-access-57gj7" (OuterVolumeSpecName: "kube-api-access-57gj7") pod "ba840bdf-362a-4cad-85e5-3f450bd7f2f5" (UID: "ba840bdf-362a-4cad-85e5-3f450bd7f2f5"). InnerVolumeSpecName "kube-api-access-57gj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:17:12 crc kubenswrapper[4781]: I0227 01:17:12.929872 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57gj7\" (UniqueName: \"kubernetes.io/projected/ba840bdf-362a-4cad-85e5-3f450bd7f2f5-kube-api-access-57gj7\") on node \"crc\" DevicePath \"\"" Feb 27 01:17:13 crc kubenswrapper[4781]: I0227 01:17:13.325547 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba840bdf-362a-4cad-85e5-3f450bd7f2f5" path="/var/lib/kubelet/pods/ba840bdf-362a-4cad-85e5-3f450bd7f2f5/volumes" Feb 27 01:17:13 crc kubenswrapper[4781]: I0227 01:17:13.587527 4781 scope.go:117] "RemoveContainer" containerID="aa2a575779b6b05c095bd940a68ea17e213f83823d4e072ee160e36b9bfd3fea" Feb 27 01:17:13 crc kubenswrapper[4781]: I0227 01:17:13.587579 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vvzsl/crc-debug-8tfrx" Feb 27 01:17:41 crc kubenswrapper[4781]: I0227 01:17:41.998243 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_58009056-4183-4017-bfa1-c14ce28b92ea/init-config-reloader/0.log" Feb 27 01:17:42 crc kubenswrapper[4781]: I0227 01:17:42.221799 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_58009056-4183-4017-bfa1-c14ce28b92ea/init-config-reloader/0.log" Feb 27 01:17:42 crc kubenswrapper[4781]: I0227 01:17:42.227154 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_58009056-4183-4017-bfa1-c14ce28b92ea/config-reloader/0.log" Feb 27 01:17:42 crc kubenswrapper[4781]: I0227 01:17:42.249381 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_58009056-4183-4017-bfa1-c14ce28b92ea/alertmanager/0.log" Feb 27 01:17:42 crc kubenswrapper[4781]: I0227 01:17:42.935350 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-9fcdb6594-94vkn_582fee51-d9df-4150-b217-889f2f4f8852/barbican-api/0.log" Feb 27 01:17:42 crc kubenswrapper[4781]: I0227 01:17:42.949212 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6fff4854c8-ttzsm_41039943-96a7-4fe6-8b66-0d64cd12a1fa/barbican-keystone-listener/0.log" Feb 27 01:17:42 crc kubenswrapper[4781]: I0227 01:17:42.981765 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-9fcdb6594-94vkn_582fee51-d9df-4150-b217-889f2f4f8852/barbican-api-log/0.log" Feb 27 01:17:43 crc kubenswrapper[4781]: I0227 01:17:43.170980 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7dd7c6f4ff-m4d2l_f92df023-2e4a-495e-bbef-4a043c661f46/barbican-worker/0.log" Feb 27 01:17:43 crc kubenswrapper[4781]: I0227 01:17:43.233912 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6fff4854c8-ttzsm_41039943-96a7-4fe6-8b66-0d64cd12a1fa/barbican-keystone-listener-log/0.log" Feb 27 01:17:43 crc kubenswrapper[4781]: I0227 01:17:43.272293 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7dd7c6f4ff-m4d2l_f92df023-2e4a-495e-bbef-4a043c661f46/barbican-worker-log/0.log" Feb 27 01:17:43 crc kubenswrapper[4781]: I0227 01:17:43.503186 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-zgrpp_94c301c2-f624-44a1-ad01-7d60748c5fca/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 01:17:43 crc kubenswrapper[4781]: I0227 01:17:43.626783 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_4f5736d7-ab3f-41d9-b5ec-94da30e708f1/ceilometer-central-agent/0.log" Feb 27 01:17:43 crc kubenswrapper[4781]: I0227 01:17:43.810489 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_4f5736d7-ab3f-41d9-b5ec-94da30e708f1/ceilometer-notification-agent/0.log" Feb 27 01:17:43 crc kubenswrapper[4781]: I0227 01:17:43.844294 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_4f5736d7-ab3f-41d9-b5ec-94da30e708f1/proxy-httpd/0.log" Feb 27 01:17:44 crc kubenswrapper[4781]: I0227 01:17:44.020605 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_4f5736d7-ab3f-41d9-b5ec-94da30e708f1/sg-core/0.log" Feb 27 01:17:44 crc kubenswrapper[4781]: I0227 01:17:44.086887 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_1cb0bf7e-097c-4c30-b0e6-224090588da2/cinder-api-log/0.log" Feb 27 01:17:44 crc kubenswrapper[4781]: I0227 01:17:44.147783 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_1cb0bf7e-097c-4c30-b0e6-224090588da2/cinder-api/0.log" Feb 27 01:17:44 crc kubenswrapper[4781]: I0227 01:17:44.677248 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_16cb4c6c-2ddb-41e0-8db3-f44961445474/probe/0.log" Feb 27 01:17:44 crc kubenswrapper[4781]: I0227 01:17:44.902440 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_16cb4c6c-2ddb-41e0-8db3-f44961445474/cinder-scheduler/0.log" Feb 27 01:17:45 crc kubenswrapper[4781]: I0227 01:17:45.199954 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_a7ad9523-5281-4d1c-a9d5-92982905d525/cloudkitty-api-log/0.log" Feb 27 01:17:45 crc kubenswrapper[4781]: I0227 01:17:45.220956 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-compactor-0_42503ae1-b143-45c3-8789-e2d1f72cc335/loki-compactor/0.log" Feb 27 01:17:45 crc kubenswrapper[4781]: I0227 01:17:45.352493 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cloudkitty-api-0_a7ad9523-5281-4d1c-a9d5-92982905d525/cloudkitty-api/0.log" Feb 27 01:17:45 crc kubenswrapper[4781]: I0227 01:17:45.545391 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-distributor-585d9bcbc-nqbgf_a5170e93-09e9-40d2-ac65-b87d44ceb185/loki-distributor/0.log" Feb 27 01:17:45 crc kubenswrapper[4781]: I0227 01:17:45.646715 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-7f8685b49f-bxttl_877c39ec-0202-4987-b6e7-4fb90c4dc9b5/gateway/0.log" Feb 27 01:17:45 crc kubenswrapper[4781]: I0227 01:17:45.735069 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-7f8685b49f-mj87x_233250c8-3871-43ec-8c1d-47bd1d3133e1/gateway/0.log" Feb 27 01:17:46 crc kubenswrapper[4781]: I0227 01:17:46.124950 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-ingester-0_2691e066-2f4c-4e7e-bcac-01933bd6cadb/loki-ingester/0.log" Feb 27 01:17:46 crc kubenswrapper[4781]: I0227 01:17:46.167250 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-index-gateway-0_684ccdab-ae41-466c-bf47-78c3ada41164/loki-index-gateway/0.log" Feb 27 01:17:46 crc kubenswrapper[4781]: I0227 01:17:46.400269 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-query-frontend-67bb4dfcd8-9rp9f_d9e3acc2-cee4-4bfe-af04-3a64041fc327/loki-query-frontend/0.log" Feb 27 01:17:46 crc kubenswrapper[4781]: I0227 01:17:46.814347 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-l7tpg_95533111-b2e6-41c2-b7b8-edc0a82e2ca5/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 01:17:47 crc kubenswrapper[4781]: I0227 01:17:47.092266 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-9drr8_f01f0f26-7e7a-464f-8f50-4d49bf87cb46/init/0.log" Feb 27 01:17:47 crc kubenswrapper[4781]: I0227 01:17:47.135189 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-qtql5_b05a1d9c-7887-4173-99fe-97f7c89cc555/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 01:17:47 crc kubenswrapper[4781]: I0227 01:17:47.303392 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-9drr8_f01f0f26-7e7a-464f-8f50-4d49bf87cb46/init/0.log" Feb 27 01:17:47 crc kubenswrapper[4781]: I0227 01:17:47.498454 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-querier-58c84b5844-whkj4_d71cee9c-2288-4843-ab71-0720c8527073/loki-querier/0.log" Feb 27 01:17:47 crc kubenswrapper[4781]: I0227 01:17:47.530944 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-9drr8_f01f0f26-7e7a-464f-8f50-4d49bf87cb46/dnsmasq-dns/0.log" Feb 27 01:17:47 crc kubenswrapper[4781]: I0227 01:17:47.818377 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-6zzbs_756e2fbc-556d-44b8-8820-e469ae73ff3b/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 01:17:47 crc kubenswrapper[4781]: I0227 01:17:47.997260 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_409aba7a-466d-40a0-b9bd-7dfd8d81ee4f/glance-log/0.log" Feb 27 01:17:48 crc kubenswrapper[4781]: I0227 01:17:48.017933 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_409aba7a-466d-40a0-b9bd-7dfd8d81ee4f/glance-httpd/0.log" Feb 27 01:17:48 crc kubenswrapper[4781]: I0227 01:17:48.207023 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_141465f3-d299-4d9c-a74f-0df5c741e325/glance-httpd/0.log" Feb 27 01:17:48 crc kubenswrapper[4781]: I0227 01:17:48.262112 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_141465f3-d299-4d9c-a74f-0df5c741e325/glance-log/0.log" Feb 27 01:17:48 crc kubenswrapper[4781]: I0227 01:17:48.531404 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-fx894_0dace61f-2e30-4132-9ce6-1cb1c8a6cedc/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 01:17:48 crc kubenswrapper[4781]: I0227 01:17:48.663573 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-rrxrj_29e8157f-b610-48f3-93ac-9173fa6d484a/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 01:17:48 crc kubenswrapper[4781]: I0227 01:17:48.929747 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29535901-2chr7_8f6a0640-2204-47a2-a550-7a7bb14ebc0d/keystone-cron/0.log" Feb 27 01:17:49 crc kubenswrapper[4781]: I0227 01:17:49.198825 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_25933928-b136-4b38-955a-46a3d802a62b/kube-state-metrics/0.log" Feb 27 01:17:49 crc kubenswrapper[4781]: I0227 01:17:49.212731 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-56459cf68c-4q7c8_2467458a-476f-460f-a6ce-144d7304476d/keystone-api/0.log" Feb 27 01:17:49 crc kubenswrapper[4781]: I0227 01:17:49.312249 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-proc-0_cf4c3569-6860-4c2a-8923-42e436279a11/cloudkitty-proc/0.log" Feb 27 01:17:49 crc kubenswrapper[4781]: I0227 01:17:49.414247 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-pcf7c_bd292468-b151-4004-b0b7-bd873e7e4e2d/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 01:17:49 crc kubenswrapper[4781]: I0227 01:17:49.692788 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-56f5d76fc7-rbhdd_384db6f0-71f1-4926-9e65-5c27eb430325/neutron-api/0.log" Feb 27 01:17:49 crc kubenswrapper[4781]: I0227 01:17:49.783037 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-56f5d76fc7-rbhdd_384db6f0-71f1-4926-9e65-5c27eb430325/neutron-httpd/0.log" Feb 27 01:17:49 crc kubenswrapper[4781]: I0227 01:17:49.977602 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-vrkbq_3a3e8437-2d3f-44a9-bb1a-8b3de1e91c87/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 01:17:50 crc kubenswrapper[4781]: I0227 01:17:50.424495 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4e258c11-5caa-4d6b-ab77-841ddf83ac81/nova-api-log/0.log" Feb 27 01:17:50 crc kubenswrapper[4781]: I0227 01:17:50.465297 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_7503d0a7-eca6-4d15-9538-9cded970acc2/nova-cell0-conductor-conductor/0.log" Feb 27 01:17:50 crc kubenswrapper[4781]: I0227 01:17:50.766792 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4e258c11-5caa-4d6b-ab77-841ddf83ac81/nova-api-api/0.log" Feb 27 01:17:51 crc kubenswrapper[4781]: I0227 01:17:51.229202 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_c8c40a18-7bbd-4d06-8a8a-427de95016fa/nova-cell1-conductor-conductor/0.log" Feb 27 01:17:51 crc kubenswrapper[4781]: I0227 01:17:51.258941 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-novncproxy-0_a3b399a8-7654-47f3-be04-759080f4f180/nova-cell1-novncproxy-novncproxy/0.log" Feb 27 01:17:51 crc kubenswrapper[4781]: I0227 01:17:51.403947 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-ntt4h_d3f8abc3-17b4-4d88-890e-85304a100a97/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 01:17:51 crc kubenswrapper[4781]: I0227 01:17:51.585111 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_e32c3573-4acb-4d70-aa6e-2d647c108931/nova-metadata-log/0.log" Feb 27 01:17:51 crc kubenswrapper[4781]: I0227 01:17:51.841026 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_1d7f8c00-d318-4f7d-b67e-6743c3a82dae/nova-scheduler-scheduler/0.log" Feb 27 01:17:51 crc kubenswrapper[4781]: I0227 01:17:51.911112 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_22624edd-e366-4aff-84dd-c3cec89c0591/mysql-bootstrap/0.log" Feb 27 01:17:52 crc kubenswrapper[4781]: I0227 01:17:52.167940 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_22624edd-e366-4aff-84dd-c3cec89c0591/mysql-bootstrap/0.log" Feb 27 01:17:52 crc kubenswrapper[4781]: I0227 01:17:52.225613 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_22624edd-e366-4aff-84dd-c3cec89c0591/galera/0.log" Feb 27 01:17:52 crc kubenswrapper[4781]: I0227 01:17:52.962953 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d59d3864-af0d-407c-8431-ae2e17e4b46f/mysql-bootstrap/0.log" Feb 27 01:17:53 crc kubenswrapper[4781]: I0227 01:17:53.026550 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_e32c3573-4acb-4d70-aa6e-2d647c108931/nova-metadata-metadata/0.log" Feb 27 01:17:53 crc kubenswrapper[4781]: I0227 01:17:53.265200 4781 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d59d3864-af0d-407c-8431-ae2e17e4b46f/galera/0.log" Feb 27 01:17:53 crc kubenswrapper[4781]: I0227 01:17:53.281263 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d59d3864-af0d-407c-8431-ae2e17e4b46f/mysql-bootstrap/0.log" Feb 27 01:17:53 crc kubenswrapper[4781]: I0227 01:17:53.395932 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_02c4875e-e180-4365-a00a-828ab5d95c34/openstackclient/0.log" Feb 27 01:17:53 crc kubenswrapper[4781]: I0227 01:17:53.561772 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-9zkpb_092921e0-a033-4021-b0f5-9c89de3aa830/ovn-controller/0.log" Feb 27 01:17:53 crc kubenswrapper[4781]: I0227 01:17:53.695040 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-hx85z_cf463d95-25dd-4b99-afb0-dac99157c5fa/openstack-network-exporter/0.log" Feb 27 01:17:53 crc kubenswrapper[4781]: I0227 01:17:53.879682 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hcb9s_9c2c498e-52b1-4ee2-bcf8-3599ee89513c/ovsdb-server-init/0.log" Feb 27 01:17:54 crc kubenswrapper[4781]: I0227 01:17:54.096008 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hcb9s_9c2c498e-52b1-4ee2-bcf8-3599ee89513c/ovsdb-server-init/0.log" Feb 27 01:17:54 crc kubenswrapper[4781]: I0227 01:17:54.162189 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hcb9s_9c2c498e-52b1-4ee2-bcf8-3599ee89513c/ovs-vswitchd/0.log" Feb 27 01:17:54 crc kubenswrapper[4781]: I0227 01:17:54.165957 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hcb9s_9c2c498e-52b1-4ee2-bcf8-3599ee89513c/ovsdb-server/0.log" Feb 27 01:17:54 crc kubenswrapper[4781]: I0227 01:17:54.477900 4781 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-27lcw_e61bcd0e-2490-4f8e-a429-cf07405dc01b/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 01:17:54 crc kubenswrapper[4781]: I0227 01:17:54.488375 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d5923572-3637-49e3-9eea-72e52c5fb88b/openstack-network-exporter/0.log" Feb 27 01:17:54 crc kubenswrapper[4781]: I0227 01:17:54.557165 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d5923572-3637-49e3-9eea-72e52c5fb88b/ovn-northd/0.log" Feb 27 01:17:55 crc kubenswrapper[4781]: I0227 01:17:55.078522 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_bd103c67-d035-4de1-aba9-667d1eb67813/openstack-network-exporter/0.log" Feb 27 01:17:55 crc kubenswrapper[4781]: I0227 01:17:55.127538 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_bd103c67-d035-4de1-aba9-667d1eb67813/ovsdbserver-nb/0.log" Feb 27 01:17:55 crc kubenswrapper[4781]: I0227 01:17:55.278165 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7d499c77-ccba-41d1-9efb-8424fc7e8d0e/openstack-network-exporter/0.log" Feb 27 01:17:55 crc kubenswrapper[4781]: I0227 01:17:55.369375 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7d499c77-ccba-41d1-9efb-8424fc7e8d0e/ovsdbserver-sb/0.log" Feb 27 01:17:55 crc kubenswrapper[4781]: I0227 01:17:55.496381 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6d64c6bb46-jcp5p_5ff35aa7-7e5a-4069-8dc4-392e01a957e3/placement-api/0.log" Feb 27 01:17:55 crc kubenswrapper[4781]: I0227 01:17:55.637162 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6d64c6bb46-jcp5p_5ff35aa7-7e5a-4069-8dc4-392e01a957e3/placement-log/0.log" Feb 27 01:17:55 crc kubenswrapper[4781]: I0227 01:17:55.664976 4781 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f/init-config-reloader/0.log" Feb 27 01:17:56 crc kubenswrapper[4781]: I0227 01:17:56.058391 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f/config-reloader/0.log" Feb 27 01:17:56 crc kubenswrapper[4781]: I0227 01:17:56.097085 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f/init-config-reloader/0.log" Feb 27 01:17:56 crc kubenswrapper[4781]: I0227 01:17:56.102572 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f/prometheus/0.log" Feb 27 01:17:56 crc kubenswrapper[4781]: I0227 01:17:56.103880 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_85ac8d4c-ca9d-47b1-8f20-c5a66e01d54f/thanos-sidecar/0.log" Feb 27 01:17:56 crc kubenswrapper[4781]: I0227 01:17:56.333656 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_37519387-1738-4500-9953-52deba3e4a85/setup-container/0.log" Feb 27 01:17:56 crc kubenswrapper[4781]: I0227 01:17:56.575180 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_37519387-1738-4500-9953-52deba3e4a85/rabbitmq/0.log" Feb 27 01:17:56 crc kubenswrapper[4781]: I0227 01:17:56.649860 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ed38e2f2-b350-4abd-abe2-859c9d504aa8/setup-container/0.log" Feb 27 01:17:56 crc kubenswrapper[4781]: I0227 01:17:56.663550 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_37519387-1738-4500-9953-52deba3e4a85/setup-container/0.log" Feb 27 01:17:57 crc kubenswrapper[4781]: I0227 01:17:57.052451 4781 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ed38e2f2-b350-4abd-abe2-859c9d504aa8/rabbitmq/0.log" Feb 27 01:17:57 crc kubenswrapper[4781]: I0227 01:17:57.111311 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-jkjtz_98c901e2-eff5-4256-9add-25d09beb51e3/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 01:17:57 crc kubenswrapper[4781]: I0227 01:17:57.131425 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ed38e2f2-b350-4abd-abe2-859c9d504aa8/setup-container/0.log" Feb 27 01:17:57 crc kubenswrapper[4781]: I0227 01:17:57.353003 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-4tds4_ca27d369-00b1-47ec-88cc-87d4a7065356/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 01:17:57 crc kubenswrapper[4781]: I0227 01:17:57.355578 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-jszvt_05795337-1929-47d6-b63f-96d078b66c47/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 01:17:57 crc kubenswrapper[4781]: I0227 01:17:57.782177 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-n6nts_2a7f1888-0c26-47e0-91b4-fbf07824cab4/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 01:17:57 crc kubenswrapper[4781]: I0227 01:17:57.872272 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-vvrmt_35b9cf19-a1cd-48b5-9072-d5c71680c892/ssh-known-hosts-edpm-deployment/0.log" Feb 27 01:17:58 crc kubenswrapper[4781]: I0227 01:17:58.308058 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5c945d84cf-z5v9s_e8ba5117-540f-448d-aac6-6fde482f5f14/proxy-server/0.log" Feb 27 01:17:58 crc kubenswrapper[4781]: I0227 01:17:58.428891 
4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-6n9rn_b9cb72af-e6c8-4fe1-8e4c-9b0d3afb546b/swift-ring-rebalance/0.log" Feb 27 01:17:58 crc kubenswrapper[4781]: I0227 01:17:58.452033 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5c945d84cf-z5v9s_e8ba5117-540f-448d-aac6-6fde482f5f14/proxy-httpd/0.log" Feb 27 01:17:58 crc kubenswrapper[4781]: I0227 01:17:58.571872 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11/account-auditor/0.log" Feb 27 01:17:58 crc kubenswrapper[4781]: I0227 01:17:58.628254 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11/account-reaper/0.log" Feb 27 01:17:58 crc kubenswrapper[4781]: I0227 01:17:58.700432 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11/account-replicator/0.log" Feb 27 01:17:58 crc kubenswrapper[4781]: I0227 01:17:58.761392 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11/account-server/0.log" Feb 27 01:17:58 crc kubenswrapper[4781]: I0227 01:17:58.806207 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11/container-auditor/0.log" Feb 27 01:17:58 crc kubenswrapper[4781]: I0227 01:17:58.907042 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11/container-replicator/0.log" Feb 27 01:17:58 crc kubenswrapper[4781]: I0227 01:17:58.981300 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11/container-server/0.log" Feb 27 01:17:58 crc kubenswrapper[4781]: I0227 01:17:58.992343 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11/container-updater/0.log" Feb 27 01:17:59 crc kubenswrapper[4781]: I0227 01:17:59.157009 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11/object-auditor/0.log" Feb 27 01:17:59 crc kubenswrapper[4781]: I0227 01:17:59.182594 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11/object-expirer/0.log" Feb 27 01:17:59 crc kubenswrapper[4781]: I0227 01:17:59.286123 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11/object-replicator/0.log" Feb 27 01:17:59 crc kubenswrapper[4781]: I0227 01:17:59.490869 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11/object-server/0.log" Feb 27 01:17:59 crc kubenswrapper[4781]: I0227 01:17:59.504330 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11/rsync/0.log" Feb 27 01:17:59 crc kubenswrapper[4781]: I0227 01:17:59.519718 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11/object-updater/0.log" Feb 27 01:17:59 crc kubenswrapper[4781]: I0227 01:17:59.627450 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd8fd81c-da11-4f2c-8cf0-18f6d05d7a11/swift-recon-cron/0.log" Feb 27 01:17:59 crc kubenswrapper[4781]: I0227 01:17:59.854782 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-4z7cs_7a6c3903-7dfd-49cd-a92f-d138e10db404/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 01:17:59 crc kubenswrapper[4781]: I0227 01:17:59.904835 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_tempest-tests-tempest_2cc23bf5-7773-4d33-b2be-2ee2a807f086/tempest-tests-tempest-tests-runner/0.log" Feb 27 01:18:00 crc kubenswrapper[4781]: I0227 01:18:00.018549 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_083b0010-19f4-4944-a097-96d20dad7eda/test-operator-logs-container/0.log" Feb 27 01:18:00 crc kubenswrapper[4781]: I0227 01:18:00.111939 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-5wj7j_9f7ced88-662a-42f0-8385-97292a7f4ce4/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 01:18:00 crc kubenswrapper[4781]: I0227 01:18:00.165703 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535918-hlgxs"] Feb 27 01:18:00 crc kubenswrapper[4781]: E0227 01:18:00.166213 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba840bdf-362a-4cad-85e5-3f450bd7f2f5" containerName="container-00" Feb 27 01:18:00 crc kubenswrapper[4781]: I0227 01:18:00.166239 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba840bdf-362a-4cad-85e5-3f450bd7f2f5" containerName="container-00" Feb 27 01:18:00 crc kubenswrapper[4781]: I0227 01:18:00.166487 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba840bdf-362a-4cad-85e5-3f450bd7f2f5" containerName="container-00" Feb 27 01:18:00 crc kubenswrapper[4781]: I0227 01:18:00.167454 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535918-hlgxs" Feb 27 01:18:00 crc kubenswrapper[4781]: I0227 01:18:00.173247 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535918-hlgxs"] Feb 27 01:18:00 crc kubenswrapper[4781]: I0227 01:18:00.175678 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:18:00 crc kubenswrapper[4781]: I0227 01:18:00.175892 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:18:00 crc kubenswrapper[4781]: I0227 01:18:00.176158 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 01:18:00 crc kubenswrapper[4781]: I0227 01:18:00.287488 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz84t\" (UniqueName: \"kubernetes.io/projected/93462151-bfc8-4c6a-8d83-adc55e0b038c-kube-api-access-hz84t\") pod \"auto-csr-approver-29535918-hlgxs\" (UID: \"93462151-bfc8-4c6a-8d83-adc55e0b038c\") " pod="openshift-infra/auto-csr-approver-29535918-hlgxs" Feb 27 01:18:00 crc kubenswrapper[4781]: I0227 01:18:00.389517 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz84t\" (UniqueName: \"kubernetes.io/projected/93462151-bfc8-4c6a-8d83-adc55e0b038c-kube-api-access-hz84t\") pod \"auto-csr-approver-29535918-hlgxs\" (UID: \"93462151-bfc8-4c6a-8d83-adc55e0b038c\") " pod="openshift-infra/auto-csr-approver-29535918-hlgxs" Feb 27 01:18:00 crc kubenswrapper[4781]: I0227 01:18:00.414588 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz84t\" (UniqueName: \"kubernetes.io/projected/93462151-bfc8-4c6a-8d83-adc55e0b038c-kube-api-access-hz84t\") pod \"auto-csr-approver-29535918-hlgxs\" (UID: \"93462151-bfc8-4c6a-8d83-adc55e0b038c\") " 
pod="openshift-infra/auto-csr-approver-29535918-hlgxs" Feb 27 01:18:00 crc kubenswrapper[4781]: I0227 01:18:00.507131 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535918-hlgxs" Feb 27 01:18:01 crc kubenswrapper[4781]: I0227 01:18:01.044175 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535918-hlgxs"] Feb 27 01:18:01 crc kubenswrapper[4781]: I0227 01:18:01.057290 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 01:18:02 crc kubenswrapper[4781]: I0227 01:18:02.075702 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535918-hlgxs" event={"ID":"93462151-bfc8-4c6a-8d83-adc55e0b038c","Type":"ContainerStarted","Data":"79ffb766c3e5304283b457d328e97935eaab5825ccd66c5d31295439b77ab474"} Feb 27 01:18:03 crc kubenswrapper[4781]: I0227 01:18:03.086648 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535918-hlgxs" event={"ID":"93462151-bfc8-4c6a-8d83-adc55e0b038c","Type":"ContainerStarted","Data":"500185c8a41f1ea03fad4eed8ceeb62b2a655600fefd254d6835b485744f3e8b"} Feb 27 01:18:03 crc kubenswrapper[4781]: I0227 01:18:03.107520 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535918-hlgxs" podStartSLOduration=1.898426119 podStartE2EDuration="3.10749825s" podCreationTimestamp="2026-02-27 01:18:00 +0000 UTC" firstStartedPulling="2026-02-27 01:18:01.056947753 +0000 UTC m=+4350.314487307" lastFinishedPulling="2026-02-27 01:18:02.266019884 +0000 UTC m=+4351.523559438" observedRunningTime="2026-02-27 01:18:03.100339858 +0000 UTC m=+4352.357879422" watchObservedRunningTime="2026-02-27 01:18:03.10749825 +0000 UTC m=+4352.365037814" Feb 27 01:18:04 crc kubenswrapper[4781]: I0227 01:18:04.115995 4781 generic.go:334] "Generic (PLEG): container finished" 
podID="93462151-bfc8-4c6a-8d83-adc55e0b038c" containerID="500185c8a41f1ea03fad4eed8ceeb62b2a655600fefd254d6835b485744f3e8b" exitCode=0 Feb 27 01:18:04 crc kubenswrapper[4781]: I0227 01:18:04.116356 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535918-hlgxs" event={"ID":"93462151-bfc8-4c6a-8d83-adc55e0b038c","Type":"ContainerDied","Data":"500185c8a41f1ea03fad4eed8ceeb62b2a655600fefd254d6835b485744f3e8b"} Feb 27 01:18:04 crc kubenswrapper[4781]: I0227 01:18:04.853119 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_06e98c4a-d812-4e42-b95c-d263e49bf5d3/memcached/0.log" Feb 27 01:18:05 crc kubenswrapper[4781]: I0227 01:18:05.714666 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535918-hlgxs" Feb 27 01:18:05 crc kubenswrapper[4781]: I0227 01:18:05.831470 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hz84t\" (UniqueName: \"kubernetes.io/projected/93462151-bfc8-4c6a-8d83-adc55e0b038c-kube-api-access-hz84t\") pod \"93462151-bfc8-4c6a-8d83-adc55e0b038c\" (UID: \"93462151-bfc8-4c6a-8d83-adc55e0b038c\") " Feb 27 01:18:05 crc kubenswrapper[4781]: I0227 01:18:05.838693 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93462151-bfc8-4c6a-8d83-adc55e0b038c-kube-api-access-hz84t" (OuterVolumeSpecName: "kube-api-access-hz84t") pod "93462151-bfc8-4c6a-8d83-adc55e0b038c" (UID: "93462151-bfc8-4c6a-8d83-adc55e0b038c"). InnerVolumeSpecName "kube-api-access-hz84t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:18:05 crc kubenswrapper[4781]: I0227 01:18:05.933872 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hz84t\" (UniqueName: \"kubernetes.io/projected/93462151-bfc8-4c6a-8d83-adc55e0b038c-kube-api-access-hz84t\") on node \"crc\" DevicePath \"\"" Feb 27 01:18:06 crc kubenswrapper[4781]: I0227 01:18:06.137192 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535918-hlgxs" event={"ID":"93462151-bfc8-4c6a-8d83-adc55e0b038c","Type":"ContainerDied","Data":"79ffb766c3e5304283b457d328e97935eaab5825ccd66c5d31295439b77ab474"} Feb 27 01:18:06 crc kubenswrapper[4781]: I0227 01:18:06.137227 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79ffb766c3e5304283b457d328e97935eaab5825ccd66c5d31295439b77ab474" Feb 27 01:18:06 crc kubenswrapper[4781]: I0227 01:18:06.137612 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535918-hlgxs" Feb 27 01:18:06 crc kubenswrapper[4781]: I0227 01:18:06.187152 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535912-gtvv9"] Feb 27 01:18:06 crc kubenswrapper[4781]: I0227 01:18:06.197321 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535912-gtvv9"] Feb 27 01:18:07 crc kubenswrapper[4781]: I0227 01:18:07.322086 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95d6c94d-6b4e-4d64-8c67-eb43c03187c2" path="/var/lib/kubelet/pods/95d6c94d-6b4e-4d64-8c67-eb43c03187c2/volumes" Feb 27 01:18:12 crc kubenswrapper[4781]: I0227 01:18:12.895650 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 27 01:18:12 crc kubenswrapper[4781]: I0227 01:18:12.896271 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:18:30 crc kubenswrapper[4781]: I0227 01:18:30.390264 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b_343b5811-baf3-443e-a8fe-074f7b844d14/util/0.log" Feb 27 01:18:30 crc kubenswrapper[4781]: I0227 01:18:30.708378 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b_343b5811-baf3-443e-a8fe-074f7b844d14/util/0.log" Feb 27 01:18:30 crc kubenswrapper[4781]: I0227 01:18:30.728358 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b_343b5811-baf3-443e-a8fe-074f7b844d14/pull/0.log" Feb 27 01:18:30 crc kubenswrapper[4781]: I0227 01:18:30.769244 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b_343b5811-baf3-443e-a8fe-074f7b844d14/pull/0.log" Feb 27 01:18:30 crc kubenswrapper[4781]: I0227 01:18:30.917957 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b_343b5811-baf3-443e-a8fe-074f7b844d14/extract/0.log" Feb 27 01:18:30 crc kubenswrapper[4781]: I0227 01:18:30.944859 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b_343b5811-baf3-443e-a8fe-074f7b844d14/util/0.log" Feb 27 
01:18:30 crc kubenswrapper[4781]: I0227 01:18:30.961363 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1f58c209d9396d10af42c9ef50a4775a82ae7fb1d9652464d120095b4399x7b_343b5811-baf3-443e-a8fe-074f7b844d14/pull/0.log" Feb 27 01:18:31 crc kubenswrapper[4781]: I0227 01:18:31.355659 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-rn44b_bd77d7fe-85fb-4b16-aa12-75359b52e139/manager/0.log" Feb 27 01:18:31 crc kubenswrapper[4781]: I0227 01:18:31.820670 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-784b5bb6c5-4gl88_66c995b3-f763-455e-8ea3-7dfdfb4c4301/manager/0.log" Feb 27 01:18:32 crc kubenswrapper[4781]: I0227 01:18:32.015296 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-nfzvw_6739bbb3-bf62-4b1d-8dd7-3accde691e66/manager/0.log" Feb 27 01:18:32 crc kubenswrapper[4781]: I0227 01:18:32.327007 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-fmbwz_c1807c06-6c68-477c-8725-5702e2d59c93/manager/0.log" Feb 27 01:18:32 crc kubenswrapper[4781]: I0227 01:18:32.917841 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-szs2w_513da4ed-be63-45dd-a32a-27ac3ef443a5/manager/0.log" Feb 27 01:18:33 crc kubenswrapper[4781]: I0227 01:18:33.033156 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-vhmbb_771a50fd-33f6-47ba-ac4a-46da5446cdd8/manager/0.log" Feb 27 01:18:33 crc kubenswrapper[4781]: I0227 01:18:33.444425 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-2pgf6_057d4c8d-606e-44ea-89ea-fb17b4d63733/manager/0.log" Feb 27 01:18:33 crc kubenswrapper[4781]: I0227 01:18:33.600922 4781 scope.go:117] "RemoveContainer" containerID="8f787ca4f347bb157c6f5d9ee468bbb739868634c8f4daa10b685f41a5344282" Feb 27 01:18:33 crc kubenswrapper[4781]: I0227 01:18:33.603043 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-jnhdb_a4e55d6f-0ca4-466c-80d0-cada3ff9f8ad/manager/0.log" Feb 27 01:18:33 crc kubenswrapper[4781]: I0227 01:18:33.716393 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-fb2wf_e4d59c4e-1fd2-43d9-8ac2-d162e746e758/manager/0.log" Feb 27 01:18:33 crc kubenswrapper[4781]: I0227 01:18:33.901550 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-w5wp5_f777df4b-1040-4f86-a816-ea778b9e5dc3/manager/0.log" Feb 27 01:18:34 crc kubenswrapper[4781]: I0227 01:18:34.166449 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6bd4687957-v5hwb_fe25346c-5f31-478e-a639-060c5958b1eb/manager/0.log" Feb 27 01:18:35 crc kubenswrapper[4781]: I0227 01:18:35.082007 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-trb7t_e9a3b900-688c-4043-b1ff-53ae1c3ee1d6/manager/0.log" Feb 27 01:18:35 crc kubenswrapper[4781]: I0227 01:18:35.208146 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-659dc6bbfc-tb298_7d5e1e13-5ce4-48ba-a8c9-3db924e63840/manager/0.log" Feb 27 01:18:35 crc kubenswrapper[4781]: I0227 01:18:35.379468 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9cjq5k4_83466be2-d230-4516-b594-ee56aae3c510/manager/0.log" Feb 27 01:18:35 crc kubenswrapper[4781]: I0227 01:18:35.721828 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-rrx6z_f66c974d-5687-42bd-9742-469922240fd5/registry-server/0.log" Feb 27 01:18:35 crc kubenswrapper[4781]: I0227 01:18:35.733369 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-85cf9d4d7d-cl7rb_837579c4-87be-4ce8-94ff-bf25307562db/operator/0.log" Feb 27 01:18:36 crc kubenswrapper[4781]: I0227 01:18:36.079205 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5955d8c787-bvdd5_fae0f5f8-e721-4ef1-9c8f-4574f156913f/manager/0.log" Feb 27 01:18:36 crc kubenswrapper[4781]: I0227 01:18:36.198587 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-rn2vt_9d03c922-57fc-4f7d-9e6a-b2b6f3b535d1/manager/0.log" Feb 27 01:18:36 crc kubenswrapper[4781]: I0227 01:18:36.450051 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-7gr7g_6d15395c-5ed9-43c8-b7f6-ac16e6e32e70/operator/0.log" Feb 27 01:18:37 crc kubenswrapper[4781]: I0227 01:18:37.048300 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-5mgl8_3747ddf8-799c-441c-bd9d-4450bdb72382/manager/0.log" Feb 27 01:18:37 crc kubenswrapper[4781]: I0227 01:18:37.234555 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5dc6794d5b-dc7k2_cf1fe81a-282d-4e51-b8d9-d6569a640985/manager/0.log" Feb 27 01:18:37 crc kubenswrapper[4781]: I0227 01:18:37.533104 4781 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-gs62l_d31610db-32c1-4c99-9001-ab4504649a75/manager/0.log" Feb 27 01:18:37 crc kubenswrapper[4781]: I0227 01:18:37.600134 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7677fd857d-kxknf_9fe881c2-cb59-41ce-a23c-f2dcba86d9c3/manager/0.log" Feb 27 01:18:37 crc kubenswrapper[4781]: I0227 01:18:37.699506 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-9d678b567-gttml_11361a5e-18c5-448a-8b07-8f5e3245f607/manager/0.log" Feb 27 01:18:42 crc kubenswrapper[4781]: I0227 01:18:42.617033 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-rfwpm_fe1f6a92-751f-417e-b2ff-694c10210db7/manager/0.log" Feb 27 01:18:42 crc kubenswrapper[4781]: I0227 01:18:42.895268 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:18:42 crc kubenswrapper[4781]: I0227 01:18:42.895591 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:19:00 crc kubenswrapper[4781]: I0227 01:19:00.898898 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-8d9mv_010c6a41-8e2d-4391-ac1b-82814dad98a4/control-plane-machine-set-operator/0.log" Feb 27 01:19:01 crc kubenswrapper[4781]: I0227 01:19:01.107957 
4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-29z97_77c54f3f-bdb8-42ff-a466-3bfb1e2d9464/kube-rbac-proxy/0.log" Feb 27 01:19:01 crc kubenswrapper[4781]: I0227 01:19:01.142454 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-29z97_77c54f3f-bdb8-42ff-a466-3bfb1e2d9464/machine-api-operator/0.log" Feb 27 01:19:12 crc kubenswrapper[4781]: I0227 01:19:12.895401 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:19:12 crc kubenswrapper[4781]: I0227 01:19:12.897384 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:19:12 crc kubenswrapper[4781]: I0227 01:19:12.897538 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 01:19:12 crc kubenswrapper[4781]: I0227 01:19:12.898515 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"93857194fe96d9ea4ad88dce6987b56ca3a1bbc406106d6f82950d6a036e6c83"} pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 01:19:12 crc kubenswrapper[4781]: I0227 01:19:12.898672 4781 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" containerID="cri-o://93857194fe96d9ea4ad88dce6987b56ca3a1bbc406106d6f82950d6a036e6c83" gracePeriod=600 Feb 27 01:19:13 crc kubenswrapper[4781]: I0227 01:19:13.775331 4781 generic.go:334] "Generic (PLEG): container finished" podID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerID="93857194fe96d9ea4ad88dce6987b56ca3a1bbc406106d6f82950d6a036e6c83" exitCode=0 Feb 27 01:19:13 crc kubenswrapper[4781]: I0227 01:19:13.775968 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerDied","Data":"93857194fe96d9ea4ad88dce6987b56ca3a1bbc406106d6f82950d6a036e6c83"} Feb 27 01:19:13 crc kubenswrapper[4781]: I0227 01:19:13.776003 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f"} Feb 27 01:19:13 crc kubenswrapper[4781]: I0227 01:19:13.776025 4781 scope.go:117] "RemoveContainer" containerID="75ac173b9ea339577e34023a9278c48ee9a5d5058fde8db84d4fbaa56c9dd3f5" Feb 27 01:19:15 crc kubenswrapper[4781]: I0227 01:19:15.224260 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-mwpvm_749ed3fc-65b7-4674-a1b1-0433692d2d89/cert-manager-controller/0.log" Feb 27 01:19:15 crc kubenswrapper[4781]: I0227 01:19:15.460727 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-rwwkv_b732ab89-7ea1-4378-9511-229ee7fa787f/cert-manager-webhook/0.log" Feb 27 01:19:15 crc kubenswrapper[4781]: I0227 01:19:15.486687 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-95z7d_af9e6ffa-5ea0-473d-9e75-a2715093490f/cert-manager-cainjector/0.log" Feb 27 01:19:29 crc kubenswrapper[4781]: I0227 01:19:29.012785 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-5dzvp_fcd8e350-64e3-4a25-9bc5-cce4888da20a/nmstate-console-plugin/0.log" Feb 27 01:19:29 crc kubenswrapper[4781]: I0227 01:19:29.198748 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-r6fjq_f7bf5593-bd4f-462d-bcbf-319b075a5116/nmstate-handler/0.log" Feb 27 01:19:29 crc kubenswrapper[4781]: I0227 01:19:29.336722 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-4d4ds_2b001223-04cf-4a45-843b-e62c5d13ac14/nmstate-metrics/0.log" Feb 27 01:19:29 crc kubenswrapper[4781]: I0227 01:19:29.367917 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-4d4ds_2b001223-04cf-4a45-843b-e62c5d13ac14/kube-rbac-proxy/0.log" Feb 27 01:19:29 crc kubenswrapper[4781]: I0227 01:19:29.479519 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-m8kqs_e948619f-a0f4-4463-9076-e593529e4264/nmstate-operator/0.log" Feb 27 01:19:29 crc kubenswrapper[4781]: I0227 01:19:29.591324 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-vrv7p_677ca1f7-513f-4de1-b64b-66b2524b82a1/nmstate-webhook/0.log" Feb 27 01:19:44 crc kubenswrapper[4781]: I0227 01:19:44.956149 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6c4cf64b95-qzbxj_1fed4c33-9f3f-486b-8f74-f2d9a09b92be/kube-rbac-proxy/0.log" Feb 27 01:19:45 crc kubenswrapper[4781]: I0227 01:19:45.234196 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6c4cf64b95-qzbxj_1fed4c33-9f3f-486b-8f74-f2d9a09b92be/manager/0.log" Feb 27 01:19:59 crc kubenswrapper[4781]: I0227 01:19:59.958160 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-rbdmr_c62f5f48-b15f-4d70-837c-a05addc48839/prometheus-operator/0.log" Feb 27 01:20:00 crc kubenswrapper[4781]: I0227 01:20:00.154834 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535920-hp92r"] Feb 27 01:20:00 crc kubenswrapper[4781]: E0227 01:20:00.155307 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93462151-bfc8-4c6a-8d83-adc55e0b038c" containerName="oc" Feb 27 01:20:00 crc kubenswrapper[4781]: I0227 01:20:00.155326 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="93462151-bfc8-4c6a-8d83-adc55e0b038c" containerName="oc" Feb 27 01:20:00 crc kubenswrapper[4781]: I0227 01:20:00.155583 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="93462151-bfc8-4c6a-8d83-adc55e0b038c" containerName="oc" Feb 27 01:20:00 crc kubenswrapper[4781]: I0227 01:20:00.156497 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535920-hp92r" Feb 27 01:20:00 crc kubenswrapper[4781]: I0227 01:20:00.165984 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535920-hp92r"] Feb 27 01:20:00 crc kubenswrapper[4781]: I0227 01:20:00.166505 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:20:00 crc kubenswrapper[4781]: I0227 01:20:00.167171 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 01:20:00 crc kubenswrapper[4781]: I0227 01:20:00.167386 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:20:00 crc kubenswrapper[4781]: I0227 01:20:00.232595 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_5abff2aa-f9cb-469e-9a7e-7a6eea64d4db/prometheus-operator-admission-webhook/0.log" Feb 27 01:20:00 crc kubenswrapper[4781]: I0227 01:20:00.314277 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgfr7\" (UniqueName: \"kubernetes.io/projected/960f5179-f532-4fbf-90fa-e19414cbe684-kube-api-access-hgfr7\") pod \"auto-csr-approver-29535920-hp92r\" (UID: \"960f5179-f532-4fbf-90fa-e19414cbe684\") " pod="openshift-infra/auto-csr-approver-29535920-hp92r" Feb 27 01:20:00 crc kubenswrapper[4781]: I0227 01:20:00.317005 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_cbb658fa-808d-4c87-b81e-63863f31382f/prometheus-operator-admission-webhook/0.log" Feb 27 01:20:00 crc kubenswrapper[4781]: I0227 01:20:00.417792 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgfr7\" (UniqueName: 
\"kubernetes.io/projected/960f5179-f532-4fbf-90fa-e19414cbe684-kube-api-access-hgfr7\") pod \"auto-csr-approver-29535920-hp92r\" (UID: \"960f5179-f532-4fbf-90fa-e19414cbe684\") " pod="openshift-infra/auto-csr-approver-29535920-hp92r" Feb 27 01:20:00 crc kubenswrapper[4781]: I0227 01:20:00.454597 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgfr7\" (UniqueName: \"kubernetes.io/projected/960f5179-f532-4fbf-90fa-e19414cbe684-kube-api-access-hgfr7\") pod \"auto-csr-approver-29535920-hp92r\" (UID: \"960f5179-f532-4fbf-90fa-e19414cbe684\") " pod="openshift-infra/auto-csr-approver-29535920-hp92r" Feb 27 01:20:00 crc kubenswrapper[4781]: I0227 01:20:00.477806 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535920-hp92r" Feb 27 01:20:00 crc kubenswrapper[4781]: I0227 01:20:00.485246 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-m6jxs_3fe8e5f0-6c7b-42bd-9604-85a90477d143/operator/0.log" Feb 27 01:20:00 crc kubenswrapper[4781]: I0227 01:20:00.506564 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-l5ppf_1a3a6a15-797e-4cfe-8e21-3a813460012d/perses-operator/0.log" Feb 27 01:20:01 crc kubenswrapper[4781]: I0227 01:20:01.125927 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535920-hp92r"] Feb 27 01:20:01 crc kubenswrapper[4781]: I0227 01:20:01.239788 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535920-hp92r" event={"ID":"960f5179-f532-4fbf-90fa-e19414cbe684","Type":"ContainerStarted","Data":"3af6dbf2db555b4c7a2d07f889de88b28f2d2148bb4b56e58f4193e38f778133"} Feb 27 01:20:03 crc kubenswrapper[4781]: I0227 01:20:03.268882 4781 generic.go:334] "Generic (PLEG): container finished" podID="960f5179-f532-4fbf-90fa-e19414cbe684" 
containerID="7e0241ade9afef50720d7328e1e27817a80d46d9126df5c01d0b6695a2c96b4c" exitCode=0 Feb 27 01:20:03 crc kubenswrapper[4781]: I0227 01:20:03.268941 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535920-hp92r" event={"ID":"960f5179-f532-4fbf-90fa-e19414cbe684","Type":"ContainerDied","Data":"7e0241ade9afef50720d7328e1e27817a80d46d9126df5c01d0b6695a2c96b4c"} Feb 27 01:20:04 crc kubenswrapper[4781]: I0227 01:20:04.895326 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535920-hp92r" Feb 27 01:20:05 crc kubenswrapper[4781]: I0227 01:20:05.025276 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgfr7\" (UniqueName: \"kubernetes.io/projected/960f5179-f532-4fbf-90fa-e19414cbe684-kube-api-access-hgfr7\") pod \"960f5179-f532-4fbf-90fa-e19414cbe684\" (UID: \"960f5179-f532-4fbf-90fa-e19414cbe684\") " Feb 27 01:20:05 crc kubenswrapper[4781]: I0227 01:20:05.034961 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/960f5179-f532-4fbf-90fa-e19414cbe684-kube-api-access-hgfr7" (OuterVolumeSpecName: "kube-api-access-hgfr7") pod "960f5179-f532-4fbf-90fa-e19414cbe684" (UID: "960f5179-f532-4fbf-90fa-e19414cbe684"). InnerVolumeSpecName "kube-api-access-hgfr7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:20:05 crc kubenswrapper[4781]: I0227 01:20:05.129104 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgfr7\" (UniqueName: \"kubernetes.io/projected/960f5179-f532-4fbf-90fa-e19414cbe684-kube-api-access-hgfr7\") on node \"crc\" DevicePath \"\"" Feb 27 01:20:05 crc kubenswrapper[4781]: I0227 01:20:05.290830 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535920-hp92r" event={"ID":"960f5179-f532-4fbf-90fa-e19414cbe684","Type":"ContainerDied","Data":"3af6dbf2db555b4c7a2d07f889de88b28f2d2148bb4b56e58f4193e38f778133"} Feb 27 01:20:05 crc kubenswrapper[4781]: I0227 01:20:05.290869 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3af6dbf2db555b4c7a2d07f889de88b28f2d2148bb4b56e58f4193e38f778133" Feb 27 01:20:05 crc kubenswrapper[4781]: I0227 01:20:05.290877 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535920-hp92r" Feb 27 01:20:06 crc kubenswrapper[4781]: I0227 01:20:06.013669 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535914-n4hsg"] Feb 27 01:20:06 crc kubenswrapper[4781]: I0227 01:20:06.022581 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535914-n4hsg"] Feb 27 01:20:07 crc kubenswrapper[4781]: I0227 01:20:07.322614 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2afc6c2c-4602-4819-bb62-46008ced90dc" path="/var/lib/kubelet/pods/2afc6c2c-4602-4819-bb62-46008ced90dc/volumes" Feb 27 01:20:17 crc kubenswrapper[4781]: I0227 01:20:17.715308 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-c6m2v_dc6f679c-913d-4851-b69d-a2e26ebf450a/kube-rbac-proxy/0.log" Feb 27 01:20:17 crc kubenswrapper[4781]: I0227 01:20:17.798305 4781 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-c6m2v_dc6f679c-913d-4851-b69d-a2e26ebf450a/controller/0.log" Feb 27 01:20:17 crc kubenswrapper[4781]: I0227 01:20:17.915597 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j2n85_43006307-3a88-4e83-b57f-965df4bd043d/cp-frr-files/0.log" Feb 27 01:20:18 crc kubenswrapper[4781]: I0227 01:20:18.106524 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j2n85_43006307-3a88-4e83-b57f-965df4bd043d/cp-frr-files/0.log" Feb 27 01:20:18 crc kubenswrapper[4781]: I0227 01:20:18.124065 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j2n85_43006307-3a88-4e83-b57f-965df4bd043d/cp-reloader/0.log" Feb 27 01:20:18 crc kubenswrapper[4781]: I0227 01:20:18.125981 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j2n85_43006307-3a88-4e83-b57f-965df4bd043d/cp-metrics/0.log" Feb 27 01:20:18 crc kubenswrapper[4781]: I0227 01:20:18.188361 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j2n85_43006307-3a88-4e83-b57f-965df4bd043d/cp-reloader/0.log" Feb 27 01:20:18 crc kubenswrapper[4781]: I0227 01:20:18.430018 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j2n85_43006307-3a88-4e83-b57f-965df4bd043d/cp-metrics/0.log" Feb 27 01:20:18 crc kubenswrapper[4781]: I0227 01:20:18.444555 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j2n85_43006307-3a88-4e83-b57f-965df4bd043d/cp-frr-files/0.log" Feb 27 01:20:18 crc kubenswrapper[4781]: I0227 01:20:18.489404 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j2n85_43006307-3a88-4e83-b57f-965df4bd043d/cp-reloader/0.log" Feb 27 01:20:18 crc kubenswrapper[4781]: I0227 01:20:18.511788 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-j2n85_43006307-3a88-4e83-b57f-965df4bd043d/cp-metrics/0.log" Feb 27 01:20:18 crc kubenswrapper[4781]: I0227 01:20:18.670608 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j2n85_43006307-3a88-4e83-b57f-965df4bd043d/cp-reloader/0.log" Feb 27 01:20:18 crc kubenswrapper[4781]: I0227 01:20:18.729743 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j2n85_43006307-3a88-4e83-b57f-965df4bd043d/cp-metrics/0.log" Feb 27 01:20:18 crc kubenswrapper[4781]: I0227 01:20:18.770211 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j2n85_43006307-3a88-4e83-b57f-965df4bd043d/cp-frr-files/0.log" Feb 27 01:20:18 crc kubenswrapper[4781]: I0227 01:20:18.774942 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j2n85_43006307-3a88-4e83-b57f-965df4bd043d/controller/0.log" Feb 27 01:20:18 crc kubenswrapper[4781]: I0227 01:20:18.953024 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j2n85_43006307-3a88-4e83-b57f-965df4bd043d/frr-metrics/0.log" Feb 27 01:20:18 crc kubenswrapper[4781]: I0227 01:20:18.964451 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j2n85_43006307-3a88-4e83-b57f-965df4bd043d/kube-rbac-proxy/0.log" Feb 27 01:20:18 crc kubenswrapper[4781]: I0227 01:20:18.990203 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j2n85_43006307-3a88-4e83-b57f-965df4bd043d/kube-rbac-proxy-frr/0.log" Feb 27 01:20:19 crc kubenswrapper[4781]: I0227 01:20:19.301858 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j2n85_43006307-3a88-4e83-b57f-965df4bd043d/reloader/0.log" Feb 27 01:20:19 crc kubenswrapper[4781]: I0227 01:20:19.463421 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-cqkgx_31409f77-5542-4376-8d77-c7a018b245b7/frr-k8s-webhook-server/0.log" Feb 27 01:20:19 crc kubenswrapper[4781]: I0227 01:20:19.660297 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7586d66d7b-59ntk_7020f39f-9738-4625-bd18-e5e4e64f5956/manager/0.log" Feb 27 01:20:19 crc kubenswrapper[4781]: I0227 01:20:19.906694 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-58cbff967-5sp8v_fc2d6f99-bd3f-44e8-91fc-6865285089e7/webhook-server/0.log" Feb 27 01:20:20 crc kubenswrapper[4781]: I0227 01:20:20.155809 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tljmv_5d7e20ea-c069-4c29-9c3f-1ac3404f026c/kube-rbac-proxy/0.log" Feb 27 01:20:20 crc kubenswrapper[4781]: I0227 01:20:20.664728 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tljmv_5d7e20ea-c069-4c29-9c3f-1ac3404f026c/speaker/0.log" Feb 27 01:20:20 crc kubenswrapper[4781]: I0227 01:20:20.739212 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j2n85_43006307-3a88-4e83-b57f-965df4bd043d/frr/0.log" Feb 27 01:20:33 crc kubenswrapper[4781]: I0227 01:20:33.727904 4781 scope.go:117] "RemoveContainer" containerID="b2b6fac5723bb6bb5cfc84762685d87a6769151aad24d4f3926a5af565d7efe8" Feb 27 01:20:36 crc kubenswrapper[4781]: I0227 01:20:36.089447 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676_f26f6c49-1028-49bf-9259-4c08b835cfbb/util/0.log" Feb 27 01:20:36 crc kubenswrapper[4781]: I0227 01:20:36.417036 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676_f26f6c49-1028-49bf-9259-4c08b835cfbb/pull/0.log" Feb 27 01:20:36 crc 
kubenswrapper[4781]: I0227 01:20:36.427088 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676_f26f6c49-1028-49bf-9259-4c08b835cfbb/pull/0.log" Feb 27 01:20:36 crc kubenswrapper[4781]: I0227 01:20:36.432393 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676_f26f6c49-1028-49bf-9259-4c08b835cfbb/util/0.log" Feb 27 01:20:36 crc kubenswrapper[4781]: I0227 01:20:36.601778 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676_f26f6c49-1028-49bf-9259-4c08b835cfbb/extract/0.log" Feb 27 01:20:36 crc kubenswrapper[4781]: I0227 01:20:36.640570 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676_f26f6c49-1028-49bf-9259-4c08b835cfbb/util/0.log" Feb 27 01:20:36 crc kubenswrapper[4781]: I0227 01:20:36.675082 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82xb676_f26f6c49-1028-49bf-9259-4c08b835cfbb/pull/0.log" Feb 27 01:20:36 crc kubenswrapper[4781]: I0227 01:20:36.816699 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft_b41e2a48-4103-4cf3-be92-92180cbb2510/util/0.log" Feb 27 01:20:37 crc kubenswrapper[4781]: I0227 01:20:37.457982 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft_b41e2a48-4103-4cf3-be92-92180cbb2510/util/0.log" Feb 27 01:20:37 crc kubenswrapper[4781]: I0227 01:20:37.461865 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft_b41e2a48-4103-4cf3-be92-92180cbb2510/pull/0.log" Feb 27 01:20:37 crc kubenswrapper[4781]: I0227 01:20:37.524350 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft_b41e2a48-4103-4cf3-be92-92180cbb2510/pull/0.log" Feb 27 01:20:37 crc kubenswrapper[4781]: I0227 01:20:37.707118 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft_b41e2a48-4103-4cf3-be92-92180cbb2510/pull/0.log" Feb 27 01:20:37 crc kubenswrapper[4781]: I0227 01:20:37.743082 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft_b41e2a48-4103-4cf3-be92-92180cbb2510/util/0.log" Feb 27 01:20:37 crc kubenswrapper[4781]: I0227 01:20:37.761737 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e8665162qft_b41e2a48-4103-4cf3-be92-92180cbb2510/extract/0.log" Feb 27 01:20:37 crc kubenswrapper[4781]: I0227 01:20:37.892967 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9_d6e87b6c-eb25-4485-b639-6181c0ad86c7/util/0.log" Feb 27 01:20:38 crc kubenswrapper[4781]: I0227 01:20:38.127039 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9_d6e87b6c-eb25-4485-b639-6181c0ad86c7/pull/0.log" Feb 27 01:20:38 crc kubenswrapper[4781]: I0227 01:20:38.141269 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9_d6e87b6c-eb25-4485-b639-6181c0ad86c7/util/0.log" Feb 27 
01:20:38 crc kubenswrapper[4781]: I0227 01:20:38.150067 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9_d6e87b6c-eb25-4485-b639-6181c0ad86c7/pull/0.log" Feb 27 01:20:38 crc kubenswrapper[4781]: I0227 01:20:38.318412 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9_d6e87b6c-eb25-4485-b639-6181c0ad86c7/util/0.log" Feb 27 01:20:38 crc kubenswrapper[4781]: I0227 01:20:38.379157 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9_d6e87b6c-eb25-4485-b639-6181c0ad86c7/pull/0.log" Feb 27 01:20:38 crc kubenswrapper[4781]: I0227 01:20:38.388976 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08kmwm9_d6e87b6c-eb25-4485-b639-6181c0ad86c7/extract/0.log" Feb 27 01:20:38 crc kubenswrapper[4781]: I0227 01:20:38.507929 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kpswm_9186313b-02fa-4d6f-9394-ab05a9e3d7d4/extract-utilities/0.log" Feb 27 01:20:38 crc kubenswrapper[4781]: I0227 01:20:38.737064 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kpswm_9186313b-02fa-4d6f-9394-ab05a9e3d7d4/extract-utilities/0.log" Feb 27 01:20:38 crc kubenswrapper[4781]: I0227 01:20:38.744229 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kpswm_9186313b-02fa-4d6f-9394-ab05a9e3d7d4/extract-content/0.log" Feb 27 01:20:38 crc kubenswrapper[4781]: I0227 01:20:38.766842 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kpswm_9186313b-02fa-4d6f-9394-ab05a9e3d7d4/extract-content/0.log" Feb 27 
01:20:38 crc kubenswrapper[4781]: I0227 01:20:38.984303 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kpswm_9186313b-02fa-4d6f-9394-ab05a9e3d7d4/extract-utilities/0.log" Feb 27 01:20:38 crc kubenswrapper[4781]: I0227 01:20:38.994189 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kpswm_9186313b-02fa-4d6f-9394-ab05a9e3d7d4/extract-content/0.log" Feb 27 01:20:39 crc kubenswrapper[4781]: I0227 01:20:39.281959 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pjpww_5ef2a1c8-c174-456d-adff-2693b022fa83/extract-utilities/0.log" Feb 27 01:20:39 crc kubenswrapper[4781]: I0227 01:20:39.512699 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pjpww_5ef2a1c8-c174-456d-adff-2693b022fa83/extract-content/0.log" Feb 27 01:20:39 crc kubenswrapper[4781]: I0227 01:20:39.513101 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pjpww_5ef2a1c8-c174-456d-adff-2693b022fa83/extract-utilities/0.log" Feb 27 01:20:39 crc kubenswrapper[4781]: I0227 01:20:39.539878 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pjpww_5ef2a1c8-c174-456d-adff-2693b022fa83/extract-content/0.log" Feb 27 01:20:39 crc kubenswrapper[4781]: I0227 01:20:39.653071 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kpswm_9186313b-02fa-4d6f-9394-ab05a9e3d7d4/registry-server/0.log" Feb 27 01:20:39 crc kubenswrapper[4781]: I0227 01:20:39.840536 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pjpww_5ef2a1c8-c174-456d-adff-2693b022fa83/extract-content/0.log" Feb 27 01:20:39 crc kubenswrapper[4781]: I0227 01:20:39.840942 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-pjpww_5ef2a1c8-c174-456d-adff-2693b022fa83/extract-utilities/0.log" Feb 27 01:20:39 crc kubenswrapper[4781]: I0227 01:20:39.913254 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv_2112f4cb-1229-4856-b3ec-a882e6fba5a6/util/0.log" Feb 27 01:20:40 crc kubenswrapper[4781]: I0227 01:20:40.460104 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pjpww_5ef2a1c8-c174-456d-adff-2693b022fa83/registry-server/0.log" Feb 27 01:20:40 crc kubenswrapper[4781]: I0227 01:20:40.474846 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv_2112f4cb-1229-4856-b3ec-a882e6fba5a6/pull/0.log" Feb 27 01:20:40 crc kubenswrapper[4781]: I0227 01:20:40.478558 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv_2112f4cb-1229-4856-b3ec-a882e6fba5a6/util/0.log" Feb 27 01:20:40 crc kubenswrapper[4781]: I0227 01:20:40.526257 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv_2112f4cb-1229-4856-b3ec-a882e6fba5a6/pull/0.log" Feb 27 01:20:40 crc kubenswrapper[4781]: I0227 01:20:40.621729 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv_2112f4cb-1229-4856-b3ec-a882e6fba5a6/util/0.log" Feb 27 01:20:40 crc kubenswrapper[4781]: I0227 01:20:40.658931 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv_2112f4cb-1229-4856-b3ec-a882e6fba5a6/extract/0.log" Feb 27 01:20:40 crc kubenswrapper[4781]: I0227 
01:20:40.680513 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4skgkv_2112f4cb-1229-4856-b3ec-a882e6fba5a6/pull/0.log" Feb 27 01:20:40 crc kubenswrapper[4781]: I0227 01:20:40.757233 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-h5lrz_672e121e-2b7f-4454-b628-d99032669167/marketplace-operator/0.log" Feb 27 01:20:40 crc kubenswrapper[4781]: I0227 01:20:40.859594 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sw9sn_1b6e0f47-560e-4d1a-8414-b65b1a159c68/extract-utilities/0.log" Feb 27 01:20:41 crc kubenswrapper[4781]: I0227 01:20:41.065325 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sw9sn_1b6e0f47-560e-4d1a-8414-b65b1a159c68/extract-utilities/0.log" Feb 27 01:20:41 crc kubenswrapper[4781]: I0227 01:20:41.066539 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sw9sn_1b6e0f47-560e-4d1a-8414-b65b1a159c68/extract-content/0.log" Feb 27 01:20:41 crc kubenswrapper[4781]: I0227 01:20:41.069754 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sw9sn_1b6e0f47-560e-4d1a-8414-b65b1a159c68/extract-content/0.log" Feb 27 01:20:41 crc kubenswrapper[4781]: I0227 01:20:41.235207 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sw9sn_1b6e0f47-560e-4d1a-8414-b65b1a159c68/extract-utilities/0.log" Feb 27 01:20:41 crc kubenswrapper[4781]: I0227 01:20:41.264702 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sw9sn_1b6e0f47-560e-4d1a-8414-b65b1a159c68/extract-content/0.log" Feb 27 01:20:41 crc kubenswrapper[4781]: I0227 01:20:41.316376 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-sp6x7_dc9df096-6538-4b50-8536-bfdd5474eece/extract-utilities/0.log" Feb 27 01:20:41 crc kubenswrapper[4781]: I0227 01:20:41.460994 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sw9sn_1b6e0f47-560e-4d1a-8414-b65b1a159c68/registry-server/0.log" Feb 27 01:20:41 crc kubenswrapper[4781]: I0227 01:20:41.564250 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sp6x7_dc9df096-6538-4b50-8536-bfdd5474eece/extract-content/0.log" Feb 27 01:20:41 crc kubenswrapper[4781]: I0227 01:20:41.597099 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sp6x7_dc9df096-6538-4b50-8536-bfdd5474eece/extract-utilities/0.log" Feb 27 01:20:41 crc kubenswrapper[4781]: I0227 01:20:41.605834 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sp6x7_dc9df096-6538-4b50-8536-bfdd5474eece/extract-content/0.log" Feb 27 01:20:41 crc kubenswrapper[4781]: I0227 01:20:41.822382 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sp6x7_dc9df096-6538-4b50-8536-bfdd5474eece/extract-utilities/0.log" Feb 27 01:20:41 crc kubenswrapper[4781]: I0227 01:20:41.841878 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sp6x7_dc9df096-6538-4b50-8536-bfdd5474eece/extract-content/0.log" Feb 27 01:20:42 crc kubenswrapper[4781]: I0227 01:20:42.323785 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sp6x7_dc9df096-6538-4b50-8536-bfdd5474eece/registry-server/0.log" Feb 27 01:20:57 crc kubenswrapper[4781]: I0227 01:20:57.668351 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-rbdmr_c62f5f48-b15f-4d70-837c-a05addc48839/prometheus-operator/0.log" Feb 27 01:20:57 crc kubenswrapper[4781]: I0227 01:20:57.709475 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-76b44fb4b6-9b42m_cbb658fa-808d-4c87-b81e-63863f31382f/prometheus-operator-admission-webhook/0.log" Feb 27 01:20:57 crc kubenswrapper[4781]: I0227 01:20:57.718175 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-76b44fb4b6-txsn7_5abff2aa-f9cb-469e-9a7e-7a6eea64d4db/prometheus-operator-admission-webhook/0.log" Feb 27 01:20:57 crc kubenswrapper[4781]: I0227 01:20:57.880920 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-l5ppf_1a3a6a15-797e-4cfe-8e21-3a813460012d/perses-operator/0.log" Feb 27 01:20:57 crc kubenswrapper[4781]: I0227 01:20:57.893718 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-m6jxs_3fe8e5f0-6c7b-42bd-9604-85a90477d143/operator/0.log" Feb 27 01:21:11 crc kubenswrapper[4781]: I0227 01:21:11.056270 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6c4cf64b95-qzbxj_1fed4c33-9f3f-486b-8f74-f2d9a09b92be/kube-rbac-proxy/0.log" Feb 27 01:21:11 crc kubenswrapper[4781]: I0227 01:21:11.161073 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6c4cf64b95-qzbxj_1fed4c33-9f3f-486b-8f74-f2d9a09b92be/manager/0.log" Feb 27 01:21:38 crc kubenswrapper[4781]: I0227 01:21:38.966368 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7qsmt"] Feb 27 01:21:38 crc kubenswrapper[4781]: E0227 01:21:38.974992 4781 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="960f5179-f532-4fbf-90fa-e19414cbe684" containerName="oc" Feb 27 01:21:38 crc kubenswrapper[4781]: I0227 01:21:38.975012 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="960f5179-f532-4fbf-90fa-e19414cbe684" containerName="oc" Feb 27 01:21:38 crc kubenswrapper[4781]: I0227 01:21:38.975247 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="960f5179-f532-4fbf-90fa-e19414cbe684" containerName="oc" Feb 27 01:21:38 crc kubenswrapper[4781]: I0227 01:21:38.976829 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7qsmt" Feb 27 01:21:38 crc kubenswrapper[4781]: I0227 01:21:38.978009 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7qsmt"] Feb 27 01:21:39 crc kubenswrapper[4781]: I0227 01:21:39.123977 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppht2\" (UniqueName: \"kubernetes.io/projected/ee5018ff-4da3-4cec-9b08-6a503f0607de-kube-api-access-ppht2\") pod \"redhat-operators-7qsmt\" (UID: \"ee5018ff-4da3-4cec-9b08-6a503f0607de\") " pod="openshift-marketplace/redhat-operators-7qsmt" Feb 27 01:21:39 crc kubenswrapper[4781]: I0227 01:21:39.124317 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee5018ff-4da3-4cec-9b08-6a503f0607de-catalog-content\") pod \"redhat-operators-7qsmt\" (UID: \"ee5018ff-4da3-4cec-9b08-6a503f0607de\") " pod="openshift-marketplace/redhat-operators-7qsmt" Feb 27 01:21:39 crc kubenswrapper[4781]: I0227 01:21:39.124478 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee5018ff-4da3-4cec-9b08-6a503f0607de-utilities\") pod \"redhat-operators-7qsmt\" (UID: \"ee5018ff-4da3-4cec-9b08-6a503f0607de\") " 
pod="openshift-marketplace/redhat-operators-7qsmt" Feb 27 01:21:39 crc kubenswrapper[4781]: I0227 01:21:39.226069 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppht2\" (UniqueName: \"kubernetes.io/projected/ee5018ff-4da3-4cec-9b08-6a503f0607de-kube-api-access-ppht2\") pod \"redhat-operators-7qsmt\" (UID: \"ee5018ff-4da3-4cec-9b08-6a503f0607de\") " pod="openshift-marketplace/redhat-operators-7qsmt" Feb 27 01:21:39 crc kubenswrapper[4781]: I0227 01:21:39.226182 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee5018ff-4da3-4cec-9b08-6a503f0607de-catalog-content\") pod \"redhat-operators-7qsmt\" (UID: \"ee5018ff-4da3-4cec-9b08-6a503f0607de\") " pod="openshift-marketplace/redhat-operators-7qsmt" Feb 27 01:21:39 crc kubenswrapper[4781]: I0227 01:21:39.226257 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee5018ff-4da3-4cec-9b08-6a503f0607de-utilities\") pod \"redhat-operators-7qsmt\" (UID: \"ee5018ff-4da3-4cec-9b08-6a503f0607de\") " pod="openshift-marketplace/redhat-operators-7qsmt" Feb 27 01:21:39 crc kubenswrapper[4781]: I0227 01:21:39.226834 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee5018ff-4da3-4cec-9b08-6a503f0607de-catalog-content\") pod \"redhat-operators-7qsmt\" (UID: \"ee5018ff-4da3-4cec-9b08-6a503f0607de\") " pod="openshift-marketplace/redhat-operators-7qsmt" Feb 27 01:21:39 crc kubenswrapper[4781]: I0227 01:21:39.226865 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee5018ff-4da3-4cec-9b08-6a503f0607de-utilities\") pod \"redhat-operators-7qsmt\" (UID: \"ee5018ff-4da3-4cec-9b08-6a503f0607de\") " pod="openshift-marketplace/redhat-operators-7qsmt" Feb 27 01:21:39 crc 
kubenswrapper[4781]: I0227 01:21:39.255088 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppht2\" (UniqueName: \"kubernetes.io/projected/ee5018ff-4da3-4cec-9b08-6a503f0607de-kube-api-access-ppht2\") pod \"redhat-operators-7qsmt\" (UID: \"ee5018ff-4da3-4cec-9b08-6a503f0607de\") " pod="openshift-marketplace/redhat-operators-7qsmt" Feb 27 01:21:39 crc kubenswrapper[4781]: I0227 01:21:39.299662 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7qsmt" Feb 27 01:21:39 crc kubenswrapper[4781]: I0227 01:21:39.802774 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7qsmt"] Feb 27 01:21:40 crc kubenswrapper[4781]: I0227 01:21:40.323289 4781 generic.go:334] "Generic (PLEG): container finished" podID="ee5018ff-4da3-4cec-9b08-6a503f0607de" containerID="6b8c1dda66eb925d490cab6dc28608140b5b90bc217496cf56c716861386b03e" exitCode=0 Feb 27 01:21:40 crc kubenswrapper[4781]: I0227 01:21:40.323505 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qsmt" event={"ID":"ee5018ff-4da3-4cec-9b08-6a503f0607de","Type":"ContainerDied","Data":"6b8c1dda66eb925d490cab6dc28608140b5b90bc217496cf56c716861386b03e"} Feb 27 01:21:40 crc kubenswrapper[4781]: I0227 01:21:40.324555 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qsmt" event={"ID":"ee5018ff-4da3-4cec-9b08-6a503f0607de","Type":"ContainerStarted","Data":"414a71779dc04d471984487e92a13a48f31246dd92f455893f57cfae9ff3685e"} Feb 27 01:21:42 crc kubenswrapper[4781]: I0227 01:21:42.344356 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qsmt" event={"ID":"ee5018ff-4da3-4cec-9b08-6a503f0607de","Type":"ContainerStarted","Data":"12e29071b710bfbe7608ec4a3f3da9c2914482c14babdc35dbcff712e17a443c"} Feb 27 01:21:42 crc kubenswrapper[4781]: I0227 
01:21:42.895551 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:21:42 crc kubenswrapper[4781]: I0227 01:21:42.895958 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:21:48 crc kubenswrapper[4781]: I0227 01:21:48.400076 4781 generic.go:334] "Generic (PLEG): container finished" podID="ee5018ff-4da3-4cec-9b08-6a503f0607de" containerID="12e29071b710bfbe7608ec4a3f3da9c2914482c14babdc35dbcff712e17a443c" exitCode=0 Feb 27 01:21:48 crc kubenswrapper[4781]: I0227 01:21:48.400154 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qsmt" event={"ID":"ee5018ff-4da3-4cec-9b08-6a503f0607de","Type":"ContainerDied","Data":"12e29071b710bfbe7608ec4a3f3da9c2914482c14babdc35dbcff712e17a443c"} Feb 27 01:21:49 crc kubenswrapper[4781]: I0227 01:21:49.413218 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qsmt" event={"ID":"ee5018ff-4da3-4cec-9b08-6a503f0607de","Type":"ContainerStarted","Data":"a75eaa372e217d3b569cf2d424be8b931ec1b0cf20200a749bc1e03d42b9196e"} Feb 27 01:21:59 crc kubenswrapper[4781]: I0227 01:21:59.300582 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7qsmt" Feb 27 01:21:59 crc kubenswrapper[4781]: I0227 01:21:59.301390 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7qsmt" Feb 27 01:21:59 crc 
kubenswrapper[4781]: I0227 01:21:59.347174 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7qsmt" Feb 27 01:21:59 crc kubenswrapper[4781]: I0227 01:21:59.366762 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7qsmt" podStartSLOduration=12.898298925 podStartE2EDuration="21.366740654s" podCreationTimestamp="2026-02-27 01:21:38 +0000 UTC" firstStartedPulling="2026-02-27 01:21:40.326017376 +0000 UTC m=+4569.583556930" lastFinishedPulling="2026-02-27 01:21:48.794459105 +0000 UTC m=+4578.051998659" observedRunningTime="2026-02-27 01:21:49.435937588 +0000 UTC m=+4578.693477152" watchObservedRunningTime="2026-02-27 01:21:59.366740654 +0000 UTC m=+4588.624280208" Feb 27 01:21:59 crc kubenswrapper[4781]: I0227 01:21:59.572619 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7qsmt" Feb 27 01:21:59 crc kubenswrapper[4781]: I0227 01:21:59.614715 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7qsmt"] Feb 27 01:22:00 crc kubenswrapper[4781]: I0227 01:22:00.154795 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535922-v2mrc"] Feb 27 01:22:00 crc kubenswrapper[4781]: I0227 01:22:00.156465 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535922-v2mrc" Feb 27 01:22:00 crc kubenswrapper[4781]: I0227 01:22:00.159535 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 01:22:00 crc kubenswrapper[4781]: I0227 01:22:00.159901 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:22:00 crc kubenswrapper[4781]: I0227 01:22:00.160142 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:22:00 crc kubenswrapper[4781]: I0227 01:22:00.166911 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535922-v2mrc"] Feb 27 01:22:00 crc kubenswrapper[4781]: I0227 01:22:00.278509 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzb6n\" (UniqueName: \"kubernetes.io/projected/a4de45e6-34d0-42f2-a5ef-3db90864a559-kube-api-access-xzb6n\") pod \"auto-csr-approver-29535922-v2mrc\" (UID: \"a4de45e6-34d0-42f2-a5ef-3db90864a559\") " pod="openshift-infra/auto-csr-approver-29535922-v2mrc" Feb 27 01:22:00 crc kubenswrapper[4781]: I0227 01:22:00.381057 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzb6n\" (UniqueName: \"kubernetes.io/projected/a4de45e6-34d0-42f2-a5ef-3db90864a559-kube-api-access-xzb6n\") pod \"auto-csr-approver-29535922-v2mrc\" (UID: \"a4de45e6-34d0-42f2-a5ef-3db90864a559\") " pod="openshift-infra/auto-csr-approver-29535922-v2mrc" Feb 27 01:22:00 crc kubenswrapper[4781]: I0227 01:22:00.406561 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzb6n\" (UniqueName: \"kubernetes.io/projected/a4de45e6-34d0-42f2-a5ef-3db90864a559-kube-api-access-xzb6n\") pod \"auto-csr-approver-29535922-v2mrc\" (UID: \"a4de45e6-34d0-42f2-a5ef-3db90864a559\") " 
pod="openshift-infra/auto-csr-approver-29535922-v2mrc" Feb 27 01:22:00 crc kubenswrapper[4781]: I0227 01:22:00.485977 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535922-v2mrc" Feb 27 01:22:00 crc kubenswrapper[4781]: W0227 01:22:00.990667 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4de45e6_34d0_42f2_a5ef_3db90864a559.slice/crio-8740226de6b5e5ac6030941401dcadd96358bc95163609424d4139c880f66926 WatchSource:0}: Error finding container 8740226de6b5e5ac6030941401dcadd96358bc95163609424d4139c880f66926: Status 404 returned error can't find the container with id 8740226de6b5e5ac6030941401dcadd96358bc95163609424d4139c880f66926 Feb 27 01:22:01 crc kubenswrapper[4781]: I0227 01:22:01.002948 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535922-v2mrc"] Feb 27 01:22:01 crc kubenswrapper[4781]: I0227 01:22:01.549403 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7qsmt" podUID="ee5018ff-4da3-4cec-9b08-6a503f0607de" containerName="registry-server" containerID="cri-o://a75eaa372e217d3b569cf2d424be8b931ec1b0cf20200a749bc1e03d42b9196e" gracePeriod=2 Feb 27 01:22:01 crc kubenswrapper[4781]: I0227 01:22:01.549810 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535922-v2mrc" event={"ID":"a4de45e6-34d0-42f2-a5ef-3db90864a559","Type":"ContainerStarted","Data":"8740226de6b5e5ac6030941401dcadd96358bc95163609424d4139c880f66926"} Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.333282 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7qsmt" Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.436716 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee5018ff-4da3-4cec-9b08-6a503f0607de-utilities\") pod \"ee5018ff-4da3-4cec-9b08-6a503f0607de\" (UID: \"ee5018ff-4da3-4cec-9b08-6a503f0607de\") " Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.437112 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee5018ff-4da3-4cec-9b08-6a503f0607de-catalog-content\") pod \"ee5018ff-4da3-4cec-9b08-6a503f0607de\" (UID: \"ee5018ff-4da3-4cec-9b08-6a503f0607de\") " Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.437169 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppht2\" (UniqueName: \"kubernetes.io/projected/ee5018ff-4da3-4cec-9b08-6a503f0607de-kube-api-access-ppht2\") pod \"ee5018ff-4da3-4cec-9b08-6a503f0607de\" (UID: \"ee5018ff-4da3-4cec-9b08-6a503f0607de\") " Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.438330 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee5018ff-4da3-4cec-9b08-6a503f0607de-utilities" (OuterVolumeSpecName: "utilities") pod "ee5018ff-4da3-4cec-9b08-6a503f0607de" (UID: "ee5018ff-4da3-4cec-9b08-6a503f0607de"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.446536 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee5018ff-4da3-4cec-9b08-6a503f0607de-kube-api-access-ppht2" (OuterVolumeSpecName: "kube-api-access-ppht2") pod "ee5018ff-4da3-4cec-9b08-6a503f0607de" (UID: "ee5018ff-4da3-4cec-9b08-6a503f0607de"). InnerVolumeSpecName "kube-api-access-ppht2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.539482 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppht2\" (UniqueName: \"kubernetes.io/projected/ee5018ff-4da3-4cec-9b08-6a503f0607de-kube-api-access-ppht2\") on node \"crc\" DevicePath \"\"" Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.539937 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee5018ff-4da3-4cec-9b08-6a503f0607de-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.560399 4781 generic.go:334] "Generic (PLEG): container finished" podID="ee5018ff-4da3-4cec-9b08-6a503f0607de" containerID="a75eaa372e217d3b569cf2d424be8b931ec1b0cf20200a749bc1e03d42b9196e" exitCode=0 Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.560468 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qsmt" event={"ID":"ee5018ff-4da3-4cec-9b08-6a503f0607de","Type":"ContainerDied","Data":"a75eaa372e217d3b569cf2d424be8b931ec1b0cf20200a749bc1e03d42b9196e"} Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.560502 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qsmt" event={"ID":"ee5018ff-4da3-4cec-9b08-6a503f0607de","Type":"ContainerDied","Data":"414a71779dc04d471984487e92a13a48f31246dd92f455893f57cfae9ff3685e"} Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.560542 4781 scope.go:117] "RemoveContainer" containerID="a75eaa372e217d3b569cf2d424be8b931ec1b0cf20200a749bc1e03d42b9196e" Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.560705 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7qsmt" Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.565121 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535922-v2mrc" event={"ID":"a4de45e6-34d0-42f2-a5ef-3db90864a559","Type":"ContainerStarted","Data":"2f2dfc7d2070f93d2a77b14a366a4665e36d3d475590bb3af0ca699d7f8bebd2"} Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.582201 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535922-v2mrc" podStartSLOduration=1.647149111 podStartE2EDuration="2.582177329s" podCreationTimestamp="2026-02-27 01:22:00 +0000 UTC" firstStartedPulling="2026-02-27 01:22:00.993990944 +0000 UTC m=+4590.251530498" lastFinishedPulling="2026-02-27 01:22:01.929019162 +0000 UTC m=+4591.186558716" observedRunningTime="2026-02-27 01:22:02.580617177 +0000 UTC m=+4591.838156731" watchObservedRunningTime="2026-02-27 01:22:02.582177329 +0000 UTC m=+4591.839716883" Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.584604 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee5018ff-4da3-4cec-9b08-6a503f0607de-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee5018ff-4da3-4cec-9b08-6a503f0607de" (UID: "ee5018ff-4da3-4cec-9b08-6a503f0607de"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.587107 4781 scope.go:117] "RemoveContainer" containerID="12e29071b710bfbe7608ec4a3f3da9c2914482c14babdc35dbcff712e17a443c" Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.610607 4781 scope.go:117] "RemoveContainer" containerID="6b8c1dda66eb925d490cab6dc28608140b5b90bc217496cf56c716861386b03e" Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.634301 4781 scope.go:117] "RemoveContainer" containerID="a75eaa372e217d3b569cf2d424be8b931ec1b0cf20200a749bc1e03d42b9196e" Feb 27 01:22:02 crc kubenswrapper[4781]: E0227 01:22:02.634955 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a75eaa372e217d3b569cf2d424be8b931ec1b0cf20200a749bc1e03d42b9196e\": container with ID starting with a75eaa372e217d3b569cf2d424be8b931ec1b0cf20200a749bc1e03d42b9196e not found: ID does not exist" containerID="a75eaa372e217d3b569cf2d424be8b931ec1b0cf20200a749bc1e03d42b9196e" Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.634986 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a75eaa372e217d3b569cf2d424be8b931ec1b0cf20200a749bc1e03d42b9196e"} err="failed to get container status \"a75eaa372e217d3b569cf2d424be8b931ec1b0cf20200a749bc1e03d42b9196e\": rpc error: code = NotFound desc = could not find container \"a75eaa372e217d3b569cf2d424be8b931ec1b0cf20200a749bc1e03d42b9196e\": container with ID starting with a75eaa372e217d3b569cf2d424be8b931ec1b0cf20200a749bc1e03d42b9196e not found: ID does not exist" Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.635006 4781 scope.go:117] "RemoveContainer" containerID="12e29071b710bfbe7608ec4a3f3da9c2914482c14babdc35dbcff712e17a443c" Feb 27 01:22:02 crc kubenswrapper[4781]: E0227 01:22:02.635296 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"12e29071b710bfbe7608ec4a3f3da9c2914482c14babdc35dbcff712e17a443c\": container with ID starting with 12e29071b710bfbe7608ec4a3f3da9c2914482c14babdc35dbcff712e17a443c not found: ID does not exist" containerID="12e29071b710bfbe7608ec4a3f3da9c2914482c14babdc35dbcff712e17a443c" Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.635377 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12e29071b710bfbe7608ec4a3f3da9c2914482c14babdc35dbcff712e17a443c"} err="failed to get container status \"12e29071b710bfbe7608ec4a3f3da9c2914482c14babdc35dbcff712e17a443c\": rpc error: code = NotFound desc = could not find container \"12e29071b710bfbe7608ec4a3f3da9c2914482c14babdc35dbcff712e17a443c\": container with ID starting with 12e29071b710bfbe7608ec4a3f3da9c2914482c14babdc35dbcff712e17a443c not found: ID does not exist" Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.635442 4781 scope.go:117] "RemoveContainer" containerID="6b8c1dda66eb925d490cab6dc28608140b5b90bc217496cf56c716861386b03e" Feb 27 01:22:02 crc kubenswrapper[4781]: E0227 01:22:02.635819 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b8c1dda66eb925d490cab6dc28608140b5b90bc217496cf56c716861386b03e\": container with ID starting with 6b8c1dda66eb925d490cab6dc28608140b5b90bc217496cf56c716861386b03e not found: ID does not exist" containerID="6b8c1dda66eb925d490cab6dc28608140b5b90bc217496cf56c716861386b03e" Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.635847 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b8c1dda66eb925d490cab6dc28608140b5b90bc217496cf56c716861386b03e"} err="failed to get container status \"6b8c1dda66eb925d490cab6dc28608140b5b90bc217496cf56c716861386b03e\": rpc error: code = NotFound desc = could not find container \"6b8c1dda66eb925d490cab6dc28608140b5b90bc217496cf56c716861386b03e\": 
container with ID starting with 6b8c1dda66eb925d490cab6dc28608140b5b90bc217496cf56c716861386b03e not found: ID does not exist" Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.642564 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee5018ff-4da3-4cec-9b08-6a503f0607de-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.903045 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7qsmt"] Feb 27 01:22:02 crc kubenswrapper[4781]: I0227 01:22:02.913604 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7qsmt"] Feb 27 01:22:03 crc kubenswrapper[4781]: I0227 01:22:03.322386 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee5018ff-4da3-4cec-9b08-6a503f0607de" path="/var/lib/kubelet/pods/ee5018ff-4da3-4cec-9b08-6a503f0607de/volumes" Feb 27 01:22:03 crc kubenswrapper[4781]: I0227 01:22:03.578238 4781 generic.go:334] "Generic (PLEG): container finished" podID="a4de45e6-34d0-42f2-a5ef-3db90864a559" containerID="2f2dfc7d2070f93d2a77b14a366a4665e36d3d475590bb3af0ca699d7f8bebd2" exitCode=0 Feb 27 01:22:03 crc kubenswrapper[4781]: I0227 01:22:03.578285 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535922-v2mrc" event={"ID":"a4de45e6-34d0-42f2-a5ef-3db90864a559","Type":"ContainerDied","Data":"2f2dfc7d2070f93d2a77b14a366a4665e36d3d475590bb3af0ca699d7f8bebd2"} Feb 27 01:22:05 crc kubenswrapper[4781]: I0227 01:22:05.053498 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535922-v2mrc" Feb 27 01:22:05 crc kubenswrapper[4781]: I0227 01:22:05.191055 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzb6n\" (UniqueName: \"kubernetes.io/projected/a4de45e6-34d0-42f2-a5ef-3db90864a559-kube-api-access-xzb6n\") pod \"a4de45e6-34d0-42f2-a5ef-3db90864a559\" (UID: \"a4de45e6-34d0-42f2-a5ef-3db90864a559\") " Feb 27 01:22:05 crc kubenswrapper[4781]: I0227 01:22:05.197382 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4de45e6-34d0-42f2-a5ef-3db90864a559-kube-api-access-xzb6n" (OuterVolumeSpecName: "kube-api-access-xzb6n") pod "a4de45e6-34d0-42f2-a5ef-3db90864a559" (UID: "a4de45e6-34d0-42f2-a5ef-3db90864a559"). InnerVolumeSpecName "kube-api-access-xzb6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:22:05 crc kubenswrapper[4781]: I0227 01:22:05.293992 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzb6n\" (UniqueName: \"kubernetes.io/projected/a4de45e6-34d0-42f2-a5ef-3db90864a559-kube-api-access-xzb6n\") on node \"crc\" DevicePath \"\"" Feb 27 01:22:05 crc kubenswrapper[4781]: I0227 01:22:05.595100 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535922-v2mrc" event={"ID":"a4de45e6-34d0-42f2-a5ef-3db90864a559","Type":"ContainerDied","Data":"8740226de6b5e5ac6030941401dcadd96358bc95163609424d4139c880f66926"} Feb 27 01:22:05 crc kubenswrapper[4781]: I0227 01:22:05.595145 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8740226de6b5e5ac6030941401dcadd96358bc95163609424d4139c880f66926" Feb 27 01:22:05 crc kubenswrapper[4781]: I0227 01:22:05.595150 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535922-v2mrc" Feb 27 01:22:05 crc kubenswrapper[4781]: I0227 01:22:05.650876 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535916-5rslt"] Feb 27 01:22:05 crc kubenswrapper[4781]: I0227 01:22:05.663064 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535916-5rslt"] Feb 27 01:22:07 crc kubenswrapper[4781]: I0227 01:22:07.321508 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deca34b2-a27c-46b9-bbe3-ac2d08a7a72e" path="/var/lib/kubelet/pods/deca34b2-a27c-46b9-bbe3-ac2d08a7a72e/volumes" Feb 27 01:22:12 crc kubenswrapper[4781]: I0227 01:22:12.895450 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:22:12 crc kubenswrapper[4781]: I0227 01:22:12.895993 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.372321 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zbdrn"] Feb 27 01:22:31 crc kubenswrapper[4781]: E0227 01:22:31.373248 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee5018ff-4da3-4cec-9b08-6a503f0607de" containerName="extract-utilities" Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.373261 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee5018ff-4da3-4cec-9b08-6a503f0607de" containerName="extract-utilities" Feb 
27 01:22:31 crc kubenswrapper[4781]: E0227 01:22:31.373279 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee5018ff-4da3-4cec-9b08-6a503f0607de" containerName="extract-content" Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.373285 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee5018ff-4da3-4cec-9b08-6a503f0607de" containerName="extract-content" Feb 27 01:22:31 crc kubenswrapper[4781]: E0227 01:22:31.373301 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee5018ff-4da3-4cec-9b08-6a503f0607de" containerName="registry-server" Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.373308 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee5018ff-4da3-4cec-9b08-6a503f0607de" containerName="registry-server" Feb 27 01:22:31 crc kubenswrapper[4781]: E0227 01:22:31.373336 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4de45e6-34d0-42f2-a5ef-3db90864a559" containerName="oc" Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.373342 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4de45e6-34d0-42f2-a5ef-3db90864a559" containerName="oc" Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.373537 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4de45e6-34d0-42f2-a5ef-3db90864a559" containerName="oc" Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.373564 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee5018ff-4da3-4cec-9b08-6a503f0607de" containerName="registry-server" Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.375119 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zbdrn" Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.391563 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zbdrn"] Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.445712 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/219ad386-328f-4166-a266-c28815b457f5-utilities\") pod \"redhat-marketplace-zbdrn\" (UID: \"219ad386-328f-4166-a266-c28815b457f5\") " pod="openshift-marketplace/redhat-marketplace-zbdrn" Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.445809 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2n4m\" (UniqueName: \"kubernetes.io/projected/219ad386-328f-4166-a266-c28815b457f5-kube-api-access-f2n4m\") pod \"redhat-marketplace-zbdrn\" (UID: \"219ad386-328f-4166-a266-c28815b457f5\") " pod="openshift-marketplace/redhat-marketplace-zbdrn" Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.445846 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/219ad386-328f-4166-a266-c28815b457f5-catalog-content\") pod \"redhat-marketplace-zbdrn\" (UID: \"219ad386-328f-4166-a266-c28815b457f5\") " pod="openshift-marketplace/redhat-marketplace-zbdrn" Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.547971 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/219ad386-328f-4166-a266-c28815b457f5-utilities\") pod \"redhat-marketplace-zbdrn\" (UID: \"219ad386-328f-4166-a266-c28815b457f5\") " pod="openshift-marketplace/redhat-marketplace-zbdrn" Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.548058 4781 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-f2n4m\" (UniqueName: \"kubernetes.io/projected/219ad386-328f-4166-a266-c28815b457f5-kube-api-access-f2n4m\") pod \"redhat-marketplace-zbdrn\" (UID: \"219ad386-328f-4166-a266-c28815b457f5\") " pod="openshift-marketplace/redhat-marketplace-zbdrn" Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.548101 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/219ad386-328f-4166-a266-c28815b457f5-catalog-content\") pod \"redhat-marketplace-zbdrn\" (UID: \"219ad386-328f-4166-a266-c28815b457f5\") " pod="openshift-marketplace/redhat-marketplace-zbdrn" Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.548555 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/219ad386-328f-4166-a266-c28815b457f5-utilities\") pod \"redhat-marketplace-zbdrn\" (UID: \"219ad386-328f-4166-a266-c28815b457f5\") " pod="openshift-marketplace/redhat-marketplace-zbdrn" Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.548566 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/219ad386-328f-4166-a266-c28815b457f5-catalog-content\") pod \"redhat-marketplace-zbdrn\" (UID: \"219ad386-328f-4166-a266-c28815b457f5\") " pod="openshift-marketplace/redhat-marketplace-zbdrn" Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.568788 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2n4m\" (UniqueName: \"kubernetes.io/projected/219ad386-328f-4166-a266-c28815b457f5-kube-api-access-f2n4m\") pod \"redhat-marketplace-zbdrn\" (UID: \"219ad386-328f-4166-a266-c28815b457f5\") " pod="openshift-marketplace/redhat-marketplace-zbdrn" Feb 27 01:22:31 crc kubenswrapper[4781]: I0227 01:22:31.694992 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zbdrn" Feb 27 01:22:32 crc kubenswrapper[4781]: I0227 01:22:32.257339 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zbdrn"] Feb 27 01:22:32 crc kubenswrapper[4781]: I0227 01:22:32.874079 4781 generic.go:334] "Generic (PLEG): container finished" podID="219ad386-328f-4166-a266-c28815b457f5" containerID="2462823339fca04753fa339fcde4fab6b795ec9ca5111df0b4be4d7401713029" exitCode=0 Feb 27 01:22:32 crc kubenswrapper[4781]: I0227 01:22:32.874124 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbdrn" event={"ID":"219ad386-328f-4166-a266-c28815b457f5","Type":"ContainerDied","Data":"2462823339fca04753fa339fcde4fab6b795ec9ca5111df0b4be4d7401713029"} Feb 27 01:22:32 crc kubenswrapper[4781]: I0227 01:22:32.874500 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbdrn" event={"ID":"219ad386-328f-4166-a266-c28815b457f5","Type":"ContainerStarted","Data":"ac148faebdbe3f69c3c48192b284b267a5615dd867270aef97bbadbb2eee1df2"} Feb 27 01:22:33 crc kubenswrapper[4781]: I0227 01:22:33.832912 4781 scope.go:117] "RemoveContainer" containerID="a89d93284b5be38596ce103c331565c9dbf5be828da69afb3c56f041c046abb6" Feb 27 01:22:33 crc kubenswrapper[4781]: I0227 01:22:33.876562 4781 scope.go:117] "RemoveContainer" containerID="5429009dce4ed7561680c8a6236f2fd38e0d3ba334a4b82f95acb92d3f8dce94" Feb 27 01:22:33 crc kubenswrapper[4781]: I0227 01:22:33.911524 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbdrn" event={"ID":"219ad386-328f-4166-a266-c28815b457f5","Type":"ContainerStarted","Data":"455e0a696ef59f3cb89f3818dfa8c96ddb587193ce3da25f9680c0d5023ff776"} Feb 27 01:22:35 crc kubenswrapper[4781]: I0227 01:22:35.931680 4781 generic.go:334] "Generic (PLEG): container finished" podID="219ad386-328f-4166-a266-c28815b457f5" 
containerID="455e0a696ef59f3cb89f3818dfa8c96ddb587193ce3da25f9680c0d5023ff776" exitCode=0 Feb 27 01:22:35 crc kubenswrapper[4781]: I0227 01:22:35.931759 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbdrn" event={"ID":"219ad386-328f-4166-a266-c28815b457f5","Type":"ContainerDied","Data":"455e0a696ef59f3cb89f3818dfa8c96ddb587193ce3da25f9680c0d5023ff776"} Feb 27 01:22:36 crc kubenswrapper[4781]: I0227 01:22:36.944302 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbdrn" event={"ID":"219ad386-328f-4166-a266-c28815b457f5","Type":"ContainerStarted","Data":"ffeaba9e07befd01e497bb0680cce40a54fb996db73d3bbe5052d97e412313ac"} Feb 27 01:22:36 crc kubenswrapper[4781]: I0227 01:22:36.976069 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zbdrn" podStartSLOduration=2.477697144 podStartE2EDuration="5.976044528s" podCreationTimestamp="2026-02-27 01:22:31 +0000 UTC" firstStartedPulling="2026-02-27 01:22:32.875874961 +0000 UTC m=+4622.133414515" lastFinishedPulling="2026-02-27 01:22:36.374222345 +0000 UTC m=+4625.631761899" observedRunningTime="2026-02-27 01:22:36.962814894 +0000 UTC m=+4626.220354448" watchObservedRunningTime="2026-02-27 01:22:36.976044528 +0000 UTC m=+4626.233584082" Feb 27 01:22:41 crc kubenswrapper[4781]: I0227 01:22:41.697922 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zbdrn" Feb 27 01:22:41 crc kubenswrapper[4781]: I0227 01:22:41.698655 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zbdrn" Feb 27 01:22:41 crc kubenswrapper[4781]: I0227 01:22:41.742331 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zbdrn" Feb 27 01:22:42 crc kubenswrapper[4781]: I0227 01:22:42.043616 
4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zbdrn" Feb 27 01:22:42 crc kubenswrapper[4781]: I0227 01:22:42.096501 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zbdrn"] Feb 27 01:22:42 crc kubenswrapper[4781]: I0227 01:22:42.896111 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:22:42 crc kubenswrapper[4781]: I0227 01:22:42.896460 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:22:42 crc kubenswrapper[4781]: I0227 01:22:42.896506 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" Feb 27 01:22:42 crc kubenswrapper[4781]: I0227 01:22:42.897307 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f"} pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 01:22:42 crc kubenswrapper[4781]: I0227 01:22:42.897371 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" 
containerID="cri-o://aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" gracePeriod=600 Feb 27 01:22:43 crc kubenswrapper[4781]: E0227 01:22:43.037375 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:22:44 crc kubenswrapper[4781]: I0227 01:22:44.013657 4781 generic.go:334] "Generic (PLEG): container finished" podID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" exitCode=0 Feb 27 01:22:44 crc kubenswrapper[4781]: I0227 01:22:44.013736 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerDied","Data":"aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f"} Feb 27 01:22:44 crc kubenswrapper[4781]: I0227 01:22:44.015313 4781 scope.go:117] "RemoveContainer" containerID="93857194fe96d9ea4ad88dce6987b56ca3a1bbc406106d6f82950d6a036e6c83" Feb 27 01:22:44 crc kubenswrapper[4781]: I0227 01:22:44.016128 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zbdrn" podUID="219ad386-328f-4166-a266-c28815b457f5" containerName="registry-server" containerID="cri-o://ffeaba9e07befd01e497bb0680cce40a54fb996db73d3bbe5052d97e412313ac" gracePeriod=2 Feb 27 01:22:44 crc kubenswrapper[4781]: I0227 01:22:44.016154 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:22:44 crc kubenswrapper[4781]: E0227 01:22:44.016741 4781 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:22:45 crc kubenswrapper[4781]: I0227 01:22:45.030078 4781 generic.go:334] "Generic (PLEG): container finished" podID="219ad386-328f-4166-a266-c28815b457f5" containerID="ffeaba9e07befd01e497bb0680cce40a54fb996db73d3bbe5052d97e412313ac" exitCode=0 Feb 27 01:22:45 crc kubenswrapper[4781]: I0227 01:22:45.030136 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbdrn" event={"ID":"219ad386-328f-4166-a266-c28815b457f5","Type":"ContainerDied","Data":"ffeaba9e07befd01e497bb0680cce40a54fb996db73d3bbe5052d97e412313ac"} Feb 27 01:22:45 crc kubenswrapper[4781]: I0227 01:22:45.307508 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zbdrn" Feb 27 01:22:45 crc kubenswrapper[4781]: I0227 01:22:45.337346 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2n4m\" (UniqueName: \"kubernetes.io/projected/219ad386-328f-4166-a266-c28815b457f5-kube-api-access-f2n4m\") pod \"219ad386-328f-4166-a266-c28815b457f5\" (UID: \"219ad386-328f-4166-a266-c28815b457f5\") " Feb 27 01:22:45 crc kubenswrapper[4781]: I0227 01:22:45.349655 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/219ad386-328f-4166-a266-c28815b457f5-kube-api-access-f2n4m" (OuterVolumeSpecName: "kube-api-access-f2n4m") pod "219ad386-328f-4166-a266-c28815b457f5" (UID: "219ad386-328f-4166-a266-c28815b457f5"). InnerVolumeSpecName "kube-api-access-f2n4m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:22:45 crc kubenswrapper[4781]: I0227 01:22:45.441904 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/219ad386-328f-4166-a266-c28815b457f5-utilities\") pod \"219ad386-328f-4166-a266-c28815b457f5\" (UID: \"219ad386-328f-4166-a266-c28815b457f5\") " Feb 27 01:22:45 crc kubenswrapper[4781]: I0227 01:22:45.441990 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/219ad386-328f-4166-a266-c28815b457f5-catalog-content\") pod \"219ad386-328f-4166-a266-c28815b457f5\" (UID: \"219ad386-328f-4166-a266-c28815b457f5\") " Feb 27 01:22:45 crc kubenswrapper[4781]: I0227 01:22:45.445691 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2n4m\" (UniqueName: \"kubernetes.io/projected/219ad386-328f-4166-a266-c28815b457f5-kube-api-access-f2n4m\") on node \"crc\" DevicePath \"\"" Feb 27 01:22:45 crc kubenswrapper[4781]: I0227 01:22:45.447945 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/219ad386-328f-4166-a266-c28815b457f5-utilities" (OuterVolumeSpecName: "utilities") pod "219ad386-328f-4166-a266-c28815b457f5" (UID: "219ad386-328f-4166-a266-c28815b457f5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:22:45 crc kubenswrapper[4781]: I0227 01:22:45.486911 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/219ad386-328f-4166-a266-c28815b457f5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "219ad386-328f-4166-a266-c28815b457f5" (UID: "219ad386-328f-4166-a266-c28815b457f5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:22:45 crc kubenswrapper[4781]: I0227 01:22:45.547475 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/219ad386-328f-4166-a266-c28815b457f5-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:22:45 crc kubenswrapper[4781]: I0227 01:22:45.547519 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/219ad386-328f-4166-a266-c28815b457f5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:22:46 crc kubenswrapper[4781]: I0227 01:22:46.042169 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbdrn" event={"ID":"219ad386-328f-4166-a266-c28815b457f5","Type":"ContainerDied","Data":"ac148faebdbe3f69c3c48192b284b267a5615dd867270aef97bbadbb2eee1df2"} Feb 27 01:22:46 crc kubenswrapper[4781]: I0227 01:22:46.042225 4781 scope.go:117] "RemoveContainer" containerID="ffeaba9e07befd01e497bb0680cce40a54fb996db73d3bbe5052d97e412313ac" Feb 27 01:22:46 crc kubenswrapper[4781]: I0227 01:22:46.042238 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zbdrn" Feb 27 01:22:46 crc kubenswrapper[4781]: I0227 01:22:46.060982 4781 scope.go:117] "RemoveContainer" containerID="455e0a696ef59f3cb89f3818dfa8c96ddb587193ce3da25f9680c0d5023ff776" Feb 27 01:22:46 crc kubenswrapper[4781]: I0227 01:22:46.082268 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zbdrn"] Feb 27 01:22:46 crc kubenswrapper[4781]: I0227 01:22:46.093976 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zbdrn"] Feb 27 01:22:46 crc kubenswrapper[4781]: I0227 01:22:46.393775 4781 scope.go:117] "RemoveContainer" containerID="2462823339fca04753fa339fcde4fab6b795ec9ca5111df0b4be4d7401713029" Feb 27 01:22:47 crc kubenswrapper[4781]: I0227 01:22:47.323492 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="219ad386-328f-4166-a266-c28815b457f5" path="/var/lib/kubelet/pods/219ad386-328f-4166-a266-c28815b457f5/volumes" Feb 27 01:22:55 crc kubenswrapper[4781]: I0227 01:22:55.310415 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:22:55 crc kubenswrapper[4781]: E0227 01:22:55.311429 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:23:09 crc kubenswrapper[4781]: I0227 01:23:09.313777 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:23:09 crc kubenswrapper[4781]: E0227 01:23:09.320194 4781 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:23:10 crc kubenswrapper[4781]: I0227 01:23:10.266535 4781 generic.go:334] "Generic (PLEG): container finished" podID="03276b70-f5f8-486f-beb1-070a017efc66" containerID="8f029b28b3e1cd59a21aaf191717990342128ebaa1b5d8e4fe5967b88d656b81" exitCode=0 Feb 27 01:23:10 crc kubenswrapper[4781]: I0227 01:23:10.266638 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vvzsl/must-gather-b97zf" event={"ID":"03276b70-f5f8-486f-beb1-070a017efc66","Type":"ContainerDied","Data":"8f029b28b3e1cd59a21aaf191717990342128ebaa1b5d8e4fe5967b88d656b81"} Feb 27 01:23:10 crc kubenswrapper[4781]: I0227 01:23:10.267828 4781 scope.go:117] "RemoveContainer" containerID="8f029b28b3e1cd59a21aaf191717990342128ebaa1b5d8e4fe5967b88d656b81" Feb 27 01:23:10 crc kubenswrapper[4781]: I0227 01:23:10.510573 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vvzsl_must-gather-b97zf_03276b70-f5f8-486f-beb1-070a017efc66/gather/0.log" Feb 27 01:23:19 crc kubenswrapper[4781]: I0227 01:23:19.655446 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vvzsl/must-gather-b97zf"] Feb 27 01:23:19 crc kubenswrapper[4781]: I0227 01:23:19.656069 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-vvzsl/must-gather-b97zf" podUID="03276b70-f5f8-486f-beb1-070a017efc66" containerName="copy" containerID="cri-o://7feeb6638ba83691635c80772fccbd29faea823d283b7f3acec5a9f856081234" gracePeriod=2 Feb 27 01:23:19 crc kubenswrapper[4781]: I0227 01:23:19.666242 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-vvzsl/must-gather-b97zf"] Feb 27 01:23:20 crc kubenswrapper[4781]: I0227 01:23:20.305069 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vvzsl_must-gather-b97zf_03276b70-f5f8-486f-beb1-070a017efc66/copy/0.log" Feb 27 01:23:20 crc kubenswrapper[4781]: I0227 01:23:20.305895 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vvzsl/must-gather-b97zf" Feb 27 01:23:20 crc kubenswrapper[4781]: I0227 01:23:20.371037 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vvzsl_must-gather-b97zf_03276b70-f5f8-486f-beb1-070a017efc66/copy/0.log" Feb 27 01:23:20 crc kubenswrapper[4781]: I0227 01:23:20.371812 4781 generic.go:334] "Generic (PLEG): container finished" podID="03276b70-f5f8-486f-beb1-070a017efc66" containerID="7feeb6638ba83691635c80772fccbd29faea823d283b7f3acec5a9f856081234" exitCode=143 Feb 27 01:23:20 crc kubenswrapper[4781]: I0227 01:23:20.371864 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vvzsl/must-gather-b97zf" Feb 27 01:23:20 crc kubenswrapper[4781]: I0227 01:23:20.371879 4781 scope.go:117] "RemoveContainer" containerID="7feeb6638ba83691635c80772fccbd29faea823d283b7f3acec5a9f856081234" Feb 27 01:23:20 crc kubenswrapper[4781]: I0227 01:23:20.392408 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6k9r\" (UniqueName: \"kubernetes.io/projected/03276b70-f5f8-486f-beb1-070a017efc66-kube-api-access-q6k9r\") pod \"03276b70-f5f8-486f-beb1-070a017efc66\" (UID: \"03276b70-f5f8-486f-beb1-070a017efc66\") " Feb 27 01:23:20 crc kubenswrapper[4781]: I0227 01:23:20.392539 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/03276b70-f5f8-486f-beb1-070a017efc66-must-gather-output\") pod \"03276b70-f5f8-486f-beb1-070a017efc66\" (UID: \"03276b70-f5f8-486f-beb1-070a017efc66\") " Feb 27 01:23:20 crc kubenswrapper[4781]: I0227 01:23:20.396952 4781 scope.go:117] "RemoveContainer" containerID="8f029b28b3e1cd59a21aaf191717990342128ebaa1b5d8e4fe5967b88d656b81" Feb 27 01:23:20 crc kubenswrapper[4781]: I0227 01:23:20.407141 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03276b70-f5f8-486f-beb1-070a017efc66-kube-api-access-q6k9r" (OuterVolumeSpecName: "kube-api-access-q6k9r") pod "03276b70-f5f8-486f-beb1-070a017efc66" (UID: "03276b70-f5f8-486f-beb1-070a017efc66"). InnerVolumeSpecName "kube-api-access-q6k9r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:23:20 crc kubenswrapper[4781]: I0227 01:23:20.495698 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6k9r\" (UniqueName: \"kubernetes.io/projected/03276b70-f5f8-486f-beb1-070a017efc66-kube-api-access-q6k9r\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:20 crc kubenswrapper[4781]: I0227 01:23:20.544527 4781 scope.go:117] "RemoveContainer" containerID="7feeb6638ba83691635c80772fccbd29faea823d283b7f3acec5a9f856081234" Feb 27 01:23:20 crc kubenswrapper[4781]: E0227 01:23:20.545143 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7feeb6638ba83691635c80772fccbd29faea823d283b7f3acec5a9f856081234\": container with ID starting with 7feeb6638ba83691635c80772fccbd29faea823d283b7f3acec5a9f856081234 not found: ID does not exist" containerID="7feeb6638ba83691635c80772fccbd29faea823d283b7f3acec5a9f856081234" Feb 27 01:23:20 crc kubenswrapper[4781]: I0227 01:23:20.545191 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7feeb6638ba83691635c80772fccbd29faea823d283b7f3acec5a9f856081234"} err="failed to get container status \"7feeb6638ba83691635c80772fccbd29faea823d283b7f3acec5a9f856081234\": rpc error: code = NotFound desc = could not find container \"7feeb6638ba83691635c80772fccbd29faea823d283b7f3acec5a9f856081234\": container with ID starting with 7feeb6638ba83691635c80772fccbd29faea823d283b7f3acec5a9f856081234 not found: ID does not exist" Feb 27 01:23:20 crc kubenswrapper[4781]: I0227 01:23:20.545224 4781 scope.go:117] "RemoveContainer" containerID="8f029b28b3e1cd59a21aaf191717990342128ebaa1b5d8e4fe5967b88d656b81" Feb 27 01:23:20 crc kubenswrapper[4781]: E0227 01:23:20.545724 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8f029b28b3e1cd59a21aaf191717990342128ebaa1b5d8e4fe5967b88d656b81\": container with ID starting with 8f029b28b3e1cd59a21aaf191717990342128ebaa1b5d8e4fe5967b88d656b81 not found: ID does not exist" containerID="8f029b28b3e1cd59a21aaf191717990342128ebaa1b5d8e4fe5967b88d656b81" Feb 27 01:23:20 crc kubenswrapper[4781]: I0227 01:23:20.545755 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f029b28b3e1cd59a21aaf191717990342128ebaa1b5d8e4fe5967b88d656b81"} err="failed to get container status \"8f029b28b3e1cd59a21aaf191717990342128ebaa1b5d8e4fe5967b88d656b81\": rpc error: code = NotFound desc = could not find container \"8f029b28b3e1cd59a21aaf191717990342128ebaa1b5d8e4fe5967b88d656b81\": container with ID starting with 8f029b28b3e1cd59a21aaf191717990342128ebaa1b5d8e4fe5967b88d656b81 not found: ID does not exist" Feb 27 01:23:20 crc kubenswrapper[4781]: I0227 01:23:20.582583 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03276b70-f5f8-486f-beb1-070a017efc66-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "03276b70-f5f8-486f-beb1-070a017efc66" (UID: "03276b70-f5f8-486f-beb1-070a017efc66"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:23:20 crc kubenswrapper[4781]: I0227 01:23:20.597410 4781 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/03276b70-f5f8-486f-beb1-070a017efc66-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:21 crc kubenswrapper[4781]: I0227 01:23:21.321241 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:23:21 crc kubenswrapper[4781]: E0227 01:23:21.321569 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:23:21 crc kubenswrapper[4781]: I0227 01:23:21.326572 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03276b70-f5f8-486f-beb1-070a017efc66" path="/var/lib/kubelet/pods/03276b70-f5f8-486f-beb1-070a017efc66/volumes" Feb 27 01:23:33 crc kubenswrapper[4781]: I0227 01:23:33.312805 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:23:33 crc kubenswrapper[4781]: E0227 01:23:33.313714 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.015145 4781 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6fkgd"] Feb 27 01:23:38 crc kubenswrapper[4781]: E0227 01:23:38.015989 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="219ad386-328f-4166-a266-c28815b457f5" containerName="extract-content" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.016003 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="219ad386-328f-4166-a266-c28815b457f5" containerName="extract-content" Feb 27 01:23:38 crc kubenswrapper[4781]: E0227 01:23:38.016026 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03276b70-f5f8-486f-beb1-070a017efc66" containerName="copy" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.016032 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="03276b70-f5f8-486f-beb1-070a017efc66" containerName="copy" Feb 27 01:23:38 crc kubenswrapper[4781]: E0227 01:23:38.016051 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03276b70-f5f8-486f-beb1-070a017efc66" containerName="gather" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.016057 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="03276b70-f5f8-486f-beb1-070a017efc66" containerName="gather" Feb 27 01:23:38 crc kubenswrapper[4781]: E0227 01:23:38.016071 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="219ad386-328f-4166-a266-c28815b457f5" containerName="extract-utilities" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.016076 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="219ad386-328f-4166-a266-c28815b457f5" containerName="extract-utilities" Feb 27 01:23:38 crc kubenswrapper[4781]: E0227 01:23:38.016088 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="219ad386-328f-4166-a266-c28815b457f5" containerName="registry-server" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.016093 4781 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="219ad386-328f-4166-a266-c28815b457f5" containerName="registry-server" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.016280 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="03276b70-f5f8-486f-beb1-070a017efc66" containerName="copy" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.016291 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="219ad386-328f-4166-a266-c28815b457f5" containerName="registry-server" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.016300 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="03276b70-f5f8-486f-beb1-070a017efc66" containerName="gather" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.017798 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6fkgd" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.031498 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6fkgd"] Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.152005 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dff525c7-90db-4e5e-b13a-33b5dbfdb372-catalog-content\") pod \"certified-operators-6fkgd\" (UID: \"dff525c7-90db-4e5e-b13a-33b5dbfdb372\") " pod="openshift-marketplace/certified-operators-6fkgd" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.152304 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dff525c7-90db-4e5e-b13a-33b5dbfdb372-utilities\") pod \"certified-operators-6fkgd\" (UID: \"dff525c7-90db-4e5e-b13a-33b5dbfdb372\") " pod="openshift-marketplace/certified-operators-6fkgd" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.152565 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-mx9bj\" (UniqueName: \"kubernetes.io/projected/dff525c7-90db-4e5e-b13a-33b5dbfdb372-kube-api-access-mx9bj\") pod \"certified-operators-6fkgd\" (UID: \"dff525c7-90db-4e5e-b13a-33b5dbfdb372\") " pod="openshift-marketplace/certified-operators-6fkgd" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.254929 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dff525c7-90db-4e5e-b13a-33b5dbfdb372-catalog-content\") pod \"certified-operators-6fkgd\" (UID: \"dff525c7-90db-4e5e-b13a-33b5dbfdb372\") " pod="openshift-marketplace/certified-operators-6fkgd" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.255012 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dff525c7-90db-4e5e-b13a-33b5dbfdb372-utilities\") pod \"certified-operators-6fkgd\" (UID: \"dff525c7-90db-4e5e-b13a-33b5dbfdb372\") " pod="openshift-marketplace/certified-operators-6fkgd" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.255108 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx9bj\" (UniqueName: \"kubernetes.io/projected/dff525c7-90db-4e5e-b13a-33b5dbfdb372-kube-api-access-mx9bj\") pod \"certified-operators-6fkgd\" (UID: \"dff525c7-90db-4e5e-b13a-33b5dbfdb372\") " pod="openshift-marketplace/certified-operators-6fkgd" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.255666 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dff525c7-90db-4e5e-b13a-33b5dbfdb372-utilities\") pod \"certified-operators-6fkgd\" (UID: \"dff525c7-90db-4e5e-b13a-33b5dbfdb372\") " pod="openshift-marketplace/certified-operators-6fkgd" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.255953 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dff525c7-90db-4e5e-b13a-33b5dbfdb372-catalog-content\") pod \"certified-operators-6fkgd\" (UID: \"dff525c7-90db-4e5e-b13a-33b5dbfdb372\") " pod="openshift-marketplace/certified-operators-6fkgd" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.275386 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx9bj\" (UniqueName: \"kubernetes.io/projected/dff525c7-90db-4e5e-b13a-33b5dbfdb372-kube-api-access-mx9bj\") pod \"certified-operators-6fkgd\" (UID: \"dff525c7-90db-4e5e-b13a-33b5dbfdb372\") " pod="openshift-marketplace/certified-operators-6fkgd" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.337730 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6fkgd" Feb 27 01:23:38 crc kubenswrapper[4781]: I0227 01:23:38.894583 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6fkgd"] Feb 27 01:23:39 crc kubenswrapper[4781]: I0227 01:23:39.573704 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fkgd" event={"ID":"dff525c7-90db-4e5e-b13a-33b5dbfdb372","Type":"ContainerStarted","Data":"87d71a1b1a7c9221c815c99101feb08c7fff66f9d9415c51ea1d9ba83fed28e5"} Feb 27 01:23:40 crc kubenswrapper[4781]: I0227 01:23:40.589177 4781 generic.go:334] "Generic (PLEG): container finished" podID="dff525c7-90db-4e5e-b13a-33b5dbfdb372" containerID="a4dd1bff343244df3143e30e3150b470952a44f6875f8674a3d4412b08e46241" exitCode=0 Feb 27 01:23:40 crc kubenswrapper[4781]: I0227 01:23:40.589235 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fkgd" event={"ID":"dff525c7-90db-4e5e-b13a-33b5dbfdb372","Type":"ContainerDied","Data":"a4dd1bff343244df3143e30e3150b470952a44f6875f8674a3d4412b08e46241"} Feb 27 01:23:40 crc kubenswrapper[4781]: I0227 01:23:40.592260 4781 
provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 01:23:42 crc kubenswrapper[4781]: I0227 01:23:42.608113 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fkgd" event={"ID":"dff525c7-90db-4e5e-b13a-33b5dbfdb372","Type":"ContainerStarted","Data":"c33a66356f1532b41e23bd86d1556874b2f5f41ccd7f120a453980f02226dea5"} Feb 27 01:23:45 crc kubenswrapper[4781]: I0227 01:23:45.651876 4781 generic.go:334] "Generic (PLEG): container finished" podID="dff525c7-90db-4e5e-b13a-33b5dbfdb372" containerID="c33a66356f1532b41e23bd86d1556874b2f5f41ccd7f120a453980f02226dea5" exitCode=0 Feb 27 01:23:45 crc kubenswrapper[4781]: I0227 01:23:45.652078 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fkgd" event={"ID":"dff525c7-90db-4e5e-b13a-33b5dbfdb372","Type":"ContainerDied","Data":"c33a66356f1532b41e23bd86d1556874b2f5f41ccd7f120a453980f02226dea5"} Feb 27 01:23:46 crc kubenswrapper[4781]: I0227 01:23:46.665902 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fkgd" event={"ID":"dff525c7-90db-4e5e-b13a-33b5dbfdb372","Type":"ContainerStarted","Data":"8929a44cf62c709b1eac61572711d63608d9ab7a6552746e926c5dc59fd5a1da"} Feb 27 01:23:46 crc kubenswrapper[4781]: I0227 01:23:46.686006 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6fkgd" podStartSLOduration=4.173127472 podStartE2EDuration="9.685990132s" podCreationTimestamp="2026-02-27 01:23:37 +0000 UTC" firstStartedPulling="2026-02-27 01:23:40.591974928 +0000 UTC m=+4689.849514482" lastFinishedPulling="2026-02-27 01:23:46.104837588 +0000 UTC m=+4695.362377142" observedRunningTime="2026-02-27 01:23:46.683583489 +0000 UTC m=+4695.941123043" watchObservedRunningTime="2026-02-27 01:23:46.685990132 +0000 UTC m=+4695.943529676" Feb 27 01:23:48 crc kubenswrapper[4781]: 
I0227 01:23:48.309865 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:23:48 crc kubenswrapper[4781]: E0227 01:23:48.310573 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:23:48 crc kubenswrapper[4781]: I0227 01:23:48.338122 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6fkgd" Feb 27 01:23:48 crc kubenswrapper[4781]: I0227 01:23:48.338225 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6fkgd" Feb 27 01:23:48 crc kubenswrapper[4781]: I0227 01:23:48.385040 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6fkgd" Feb 27 01:23:58 crc kubenswrapper[4781]: I0227 01:23:58.386727 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6fkgd" Feb 27 01:23:58 crc kubenswrapper[4781]: I0227 01:23:58.440078 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6fkgd"] Feb 27 01:23:58 crc kubenswrapper[4781]: I0227 01:23:58.774778 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6fkgd" podUID="dff525c7-90db-4e5e-b13a-33b5dbfdb372" containerName="registry-server" containerID="cri-o://8929a44cf62c709b1eac61572711d63608d9ab7a6552746e926c5dc59fd5a1da" gracePeriod=2 Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.495324 
4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6fkgd" Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.558222 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dff525c7-90db-4e5e-b13a-33b5dbfdb372-catalog-content\") pod \"dff525c7-90db-4e5e-b13a-33b5dbfdb372\" (UID: \"dff525c7-90db-4e5e-b13a-33b5dbfdb372\") " Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.558288 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mx9bj\" (UniqueName: \"kubernetes.io/projected/dff525c7-90db-4e5e-b13a-33b5dbfdb372-kube-api-access-mx9bj\") pod \"dff525c7-90db-4e5e-b13a-33b5dbfdb372\" (UID: \"dff525c7-90db-4e5e-b13a-33b5dbfdb372\") " Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.558452 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dff525c7-90db-4e5e-b13a-33b5dbfdb372-utilities\") pod \"dff525c7-90db-4e5e-b13a-33b5dbfdb372\" (UID: \"dff525c7-90db-4e5e-b13a-33b5dbfdb372\") " Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.559165 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dff525c7-90db-4e5e-b13a-33b5dbfdb372-utilities" (OuterVolumeSpecName: "utilities") pod "dff525c7-90db-4e5e-b13a-33b5dbfdb372" (UID: "dff525c7-90db-4e5e-b13a-33b5dbfdb372"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.564446 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dff525c7-90db-4e5e-b13a-33b5dbfdb372-kube-api-access-mx9bj" (OuterVolumeSpecName: "kube-api-access-mx9bj") pod "dff525c7-90db-4e5e-b13a-33b5dbfdb372" (UID: "dff525c7-90db-4e5e-b13a-33b5dbfdb372"). 
InnerVolumeSpecName "kube-api-access-mx9bj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.613395 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dff525c7-90db-4e5e-b13a-33b5dbfdb372-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dff525c7-90db-4e5e-b13a-33b5dbfdb372" (UID: "dff525c7-90db-4e5e-b13a-33b5dbfdb372"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.661318 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dff525c7-90db-4e5e-b13a-33b5dbfdb372-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.661372 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dff525c7-90db-4e5e-b13a-33b5dbfdb372-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.661397 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mx9bj\" (UniqueName: \"kubernetes.io/projected/dff525c7-90db-4e5e-b13a-33b5dbfdb372-kube-api-access-mx9bj\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.785404 4781 generic.go:334] "Generic (PLEG): container finished" podID="dff525c7-90db-4e5e-b13a-33b5dbfdb372" containerID="8929a44cf62c709b1eac61572711d63608d9ab7a6552746e926c5dc59fd5a1da" exitCode=0 Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.785464 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6fkgd" Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.785479 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fkgd" event={"ID":"dff525c7-90db-4e5e-b13a-33b5dbfdb372","Type":"ContainerDied","Data":"8929a44cf62c709b1eac61572711d63608d9ab7a6552746e926c5dc59fd5a1da"} Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.786352 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fkgd" event={"ID":"dff525c7-90db-4e5e-b13a-33b5dbfdb372","Type":"ContainerDied","Data":"87d71a1b1a7c9221c815c99101feb08c7fff66f9d9415c51ea1d9ba83fed28e5"} Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.786396 4781 scope.go:117] "RemoveContainer" containerID="8929a44cf62c709b1eac61572711d63608d9ab7a6552746e926c5dc59fd5a1da" Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.807011 4781 scope.go:117] "RemoveContainer" containerID="c33a66356f1532b41e23bd86d1556874b2f5f41ccd7f120a453980f02226dea5" Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.824707 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6fkgd"] Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.831090 4781 scope.go:117] "RemoveContainer" containerID="a4dd1bff343244df3143e30e3150b470952a44f6875f8674a3d4412b08e46241" Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.837676 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6fkgd"] Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.900524 4781 scope.go:117] "RemoveContainer" containerID="8929a44cf62c709b1eac61572711d63608d9ab7a6552746e926c5dc59fd5a1da" Feb 27 01:23:59 crc kubenswrapper[4781]: E0227 01:23:59.901342 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8929a44cf62c709b1eac61572711d63608d9ab7a6552746e926c5dc59fd5a1da\": container with ID starting with 8929a44cf62c709b1eac61572711d63608d9ab7a6552746e926c5dc59fd5a1da not found: ID does not exist" containerID="8929a44cf62c709b1eac61572711d63608d9ab7a6552746e926c5dc59fd5a1da" Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.901424 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8929a44cf62c709b1eac61572711d63608d9ab7a6552746e926c5dc59fd5a1da"} err="failed to get container status \"8929a44cf62c709b1eac61572711d63608d9ab7a6552746e926c5dc59fd5a1da\": rpc error: code = NotFound desc = could not find container \"8929a44cf62c709b1eac61572711d63608d9ab7a6552746e926c5dc59fd5a1da\": container with ID starting with 8929a44cf62c709b1eac61572711d63608d9ab7a6552746e926c5dc59fd5a1da not found: ID does not exist" Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.901458 4781 scope.go:117] "RemoveContainer" containerID="c33a66356f1532b41e23bd86d1556874b2f5f41ccd7f120a453980f02226dea5" Feb 27 01:23:59 crc kubenswrapper[4781]: E0227 01:23:59.901981 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c33a66356f1532b41e23bd86d1556874b2f5f41ccd7f120a453980f02226dea5\": container with ID starting with c33a66356f1532b41e23bd86d1556874b2f5f41ccd7f120a453980f02226dea5 not found: ID does not exist" containerID="c33a66356f1532b41e23bd86d1556874b2f5f41ccd7f120a453980f02226dea5" Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.902025 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c33a66356f1532b41e23bd86d1556874b2f5f41ccd7f120a453980f02226dea5"} err="failed to get container status \"c33a66356f1532b41e23bd86d1556874b2f5f41ccd7f120a453980f02226dea5\": rpc error: code = NotFound desc = could not find container \"c33a66356f1532b41e23bd86d1556874b2f5f41ccd7f120a453980f02226dea5\": container with ID 
starting with c33a66356f1532b41e23bd86d1556874b2f5f41ccd7f120a453980f02226dea5 not found: ID does not exist" Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.902053 4781 scope.go:117] "RemoveContainer" containerID="a4dd1bff343244df3143e30e3150b470952a44f6875f8674a3d4412b08e46241" Feb 27 01:23:59 crc kubenswrapper[4781]: E0227 01:23:59.902467 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4dd1bff343244df3143e30e3150b470952a44f6875f8674a3d4412b08e46241\": container with ID starting with a4dd1bff343244df3143e30e3150b470952a44f6875f8674a3d4412b08e46241 not found: ID does not exist" containerID="a4dd1bff343244df3143e30e3150b470952a44f6875f8674a3d4412b08e46241" Feb 27 01:23:59 crc kubenswrapper[4781]: I0227 01:23:59.902521 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4dd1bff343244df3143e30e3150b470952a44f6875f8674a3d4412b08e46241"} err="failed to get container status \"a4dd1bff343244df3143e30e3150b470952a44f6875f8674a3d4412b08e46241\": rpc error: code = NotFound desc = could not find container \"a4dd1bff343244df3143e30e3150b470952a44f6875f8674a3d4412b08e46241\": container with ID starting with a4dd1bff343244df3143e30e3150b470952a44f6875f8674a3d4412b08e46241 not found: ID does not exist" Feb 27 01:24:00 crc kubenswrapper[4781]: I0227 01:24:00.147408 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535924-jqt2s"] Feb 27 01:24:00 crc kubenswrapper[4781]: E0227 01:24:00.148218 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dff525c7-90db-4e5e-b13a-33b5dbfdb372" containerName="extract-utilities" Feb 27 01:24:00 crc kubenswrapper[4781]: I0227 01:24:00.148242 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="dff525c7-90db-4e5e-b13a-33b5dbfdb372" containerName="extract-utilities" Feb 27 01:24:00 crc kubenswrapper[4781]: E0227 01:24:00.148261 4781 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dff525c7-90db-4e5e-b13a-33b5dbfdb372" containerName="extract-content" Feb 27 01:24:00 crc kubenswrapper[4781]: I0227 01:24:00.148268 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="dff525c7-90db-4e5e-b13a-33b5dbfdb372" containerName="extract-content" Feb 27 01:24:00 crc kubenswrapper[4781]: E0227 01:24:00.148287 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dff525c7-90db-4e5e-b13a-33b5dbfdb372" containerName="registry-server" Feb 27 01:24:00 crc kubenswrapper[4781]: I0227 01:24:00.148294 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="dff525c7-90db-4e5e-b13a-33b5dbfdb372" containerName="registry-server" Feb 27 01:24:00 crc kubenswrapper[4781]: I0227 01:24:00.148546 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="dff525c7-90db-4e5e-b13a-33b5dbfdb372" containerName="registry-server" Feb 27 01:24:00 crc kubenswrapper[4781]: I0227 01:24:00.149348 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535924-jqt2s" Feb 27 01:24:00 crc kubenswrapper[4781]: I0227 01:24:00.152244 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:24:00 crc kubenswrapper[4781]: I0227 01:24:00.152554 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:24:00 crc kubenswrapper[4781]: I0227 01:24:00.153172 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 01:24:00 crc kubenswrapper[4781]: I0227 01:24:00.169397 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535924-jqt2s"] Feb 27 01:24:00 crc kubenswrapper[4781]: I0227 01:24:00.275598 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdfnz\" (UniqueName: \"kubernetes.io/projected/5d4094f7-d1b4-4771-8d18-ea76f4f2afc6-kube-api-access-fdfnz\") pod \"auto-csr-approver-29535924-jqt2s\" (UID: \"5d4094f7-d1b4-4771-8d18-ea76f4f2afc6\") " pod="openshift-infra/auto-csr-approver-29535924-jqt2s" Feb 27 01:24:00 crc kubenswrapper[4781]: I0227 01:24:00.379526 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdfnz\" (UniqueName: \"kubernetes.io/projected/5d4094f7-d1b4-4771-8d18-ea76f4f2afc6-kube-api-access-fdfnz\") pod \"auto-csr-approver-29535924-jqt2s\" (UID: \"5d4094f7-d1b4-4771-8d18-ea76f4f2afc6\") " pod="openshift-infra/auto-csr-approver-29535924-jqt2s" Feb 27 01:24:00 crc kubenswrapper[4781]: I0227 01:24:00.399103 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdfnz\" (UniqueName: \"kubernetes.io/projected/5d4094f7-d1b4-4771-8d18-ea76f4f2afc6-kube-api-access-fdfnz\") pod \"auto-csr-approver-29535924-jqt2s\" (UID: \"5d4094f7-d1b4-4771-8d18-ea76f4f2afc6\") " 
pod="openshift-infra/auto-csr-approver-29535924-jqt2s" Feb 27 01:24:00 crc kubenswrapper[4781]: I0227 01:24:00.481241 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535924-jqt2s" Feb 27 01:24:01 crc kubenswrapper[4781]: I0227 01:24:01.334788 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:24:01 crc kubenswrapper[4781]: E0227 01:24:01.336465 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:24:01 crc kubenswrapper[4781]: I0227 01:24:01.334812 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dff525c7-90db-4e5e-b13a-33b5dbfdb372" path="/var/lib/kubelet/pods/dff525c7-90db-4e5e-b13a-33b5dbfdb372/volumes" Feb 27 01:24:01 crc kubenswrapper[4781]: I0227 01:24:01.438770 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535924-jqt2s"] Feb 27 01:24:01 crc kubenswrapper[4781]: W0227 01:24:01.441979 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d4094f7_d1b4_4771_8d18_ea76f4f2afc6.slice/crio-a3993c2ada8a174086507fb13e67deb1f2d4005fa3eaa96d5e20fa8862acfd4c WatchSource:0}: Error finding container a3993c2ada8a174086507fb13e67deb1f2d4005fa3eaa96d5e20fa8862acfd4c: Status 404 returned error can't find the container with id a3993c2ada8a174086507fb13e67deb1f2d4005fa3eaa96d5e20fa8862acfd4c Feb 27 01:24:01 crc kubenswrapper[4781]: I0227 01:24:01.807798 4781 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-infra/auto-csr-approver-29535924-jqt2s" event={"ID":"5d4094f7-d1b4-4771-8d18-ea76f4f2afc6","Type":"ContainerStarted","Data":"a3993c2ada8a174086507fb13e67deb1f2d4005fa3eaa96d5e20fa8862acfd4c"} Feb 27 01:24:03 crc kubenswrapper[4781]: I0227 01:24:03.828424 4781 generic.go:334] "Generic (PLEG): container finished" podID="5d4094f7-d1b4-4771-8d18-ea76f4f2afc6" containerID="ddf9edbab834e6c1a55e9fec04cd3a8734214a1dea92a8471be43663e726da2d" exitCode=0 Feb 27 01:24:03 crc kubenswrapper[4781]: I0227 01:24:03.828483 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535924-jqt2s" event={"ID":"5d4094f7-d1b4-4771-8d18-ea76f4f2afc6","Type":"ContainerDied","Data":"ddf9edbab834e6c1a55e9fec04cd3a8734214a1dea92a8471be43663e726da2d"} Feb 27 01:24:05 crc kubenswrapper[4781]: I0227 01:24:05.372379 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535924-jqt2s" Feb 27 01:24:05 crc kubenswrapper[4781]: I0227 01:24:05.487569 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdfnz\" (UniqueName: \"kubernetes.io/projected/5d4094f7-d1b4-4771-8d18-ea76f4f2afc6-kube-api-access-fdfnz\") pod \"5d4094f7-d1b4-4771-8d18-ea76f4f2afc6\" (UID: \"5d4094f7-d1b4-4771-8d18-ea76f4f2afc6\") " Feb 27 01:24:05 crc kubenswrapper[4781]: I0227 01:24:05.496260 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d4094f7-d1b4-4771-8d18-ea76f4f2afc6-kube-api-access-fdfnz" (OuterVolumeSpecName: "kube-api-access-fdfnz") pod "5d4094f7-d1b4-4771-8d18-ea76f4f2afc6" (UID: "5d4094f7-d1b4-4771-8d18-ea76f4f2afc6"). InnerVolumeSpecName "kube-api-access-fdfnz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:24:05 crc kubenswrapper[4781]: I0227 01:24:05.590578 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdfnz\" (UniqueName: \"kubernetes.io/projected/5d4094f7-d1b4-4771-8d18-ea76f4f2afc6-kube-api-access-fdfnz\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:05 crc kubenswrapper[4781]: I0227 01:24:05.849131 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535924-jqt2s" event={"ID":"5d4094f7-d1b4-4771-8d18-ea76f4f2afc6","Type":"ContainerDied","Data":"a3993c2ada8a174086507fb13e67deb1f2d4005fa3eaa96d5e20fa8862acfd4c"} Feb 27 01:24:05 crc kubenswrapper[4781]: I0227 01:24:05.849189 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3993c2ada8a174086507fb13e67deb1f2d4005fa3eaa96d5e20fa8862acfd4c" Feb 27 01:24:05 crc kubenswrapper[4781]: I0227 01:24:05.849200 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535924-jqt2s" Feb 27 01:24:06 crc kubenswrapper[4781]: I0227 01:24:06.449478 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535918-hlgxs"] Feb 27 01:24:06 crc kubenswrapper[4781]: I0227 01:24:06.461753 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535918-hlgxs"] Feb 27 01:24:07 crc kubenswrapper[4781]: I0227 01:24:07.321832 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93462151-bfc8-4c6a-8d83-adc55e0b038c" path="/var/lib/kubelet/pods/93462151-bfc8-4c6a-8d83-adc55e0b038c/volumes" Feb 27 01:24:14 crc kubenswrapper[4781]: I0227 01:24:14.309853 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:24:14 crc kubenswrapper[4781]: E0227 01:24:14.310699 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:24:19 crc kubenswrapper[4781]: I0227 01:24:19.291223 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8jb27"] Feb 27 01:24:19 crc kubenswrapper[4781]: E0227 01:24:19.292157 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d4094f7-d1b4-4771-8d18-ea76f4f2afc6" containerName="oc" Feb 27 01:24:19 crc kubenswrapper[4781]: I0227 01:24:19.292170 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d4094f7-d1b4-4771-8d18-ea76f4f2afc6" containerName="oc" Feb 27 01:24:19 crc kubenswrapper[4781]: I0227 01:24:19.292365 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d4094f7-d1b4-4771-8d18-ea76f4f2afc6" containerName="oc" Feb 27 01:24:19 crc kubenswrapper[4781]: I0227 01:24:19.293849 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8jb27" Feb 27 01:24:19 crc kubenswrapper[4781]: I0227 01:24:19.324960 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8jb27"] Feb 27 01:24:19 crc kubenswrapper[4781]: I0227 01:24:19.383819 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ead436-ddbf-4703-971c-12f3b1a5673e-catalog-content\") pod \"community-operators-8jb27\" (UID: \"c4ead436-ddbf-4703-971c-12f3b1a5673e\") " pod="openshift-marketplace/community-operators-8jb27" Feb 27 01:24:19 crc kubenswrapper[4781]: I0227 01:24:19.384032 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmp5c\" (UniqueName: \"kubernetes.io/projected/c4ead436-ddbf-4703-971c-12f3b1a5673e-kube-api-access-zmp5c\") pod \"community-operators-8jb27\" (UID: \"c4ead436-ddbf-4703-971c-12f3b1a5673e\") " pod="openshift-marketplace/community-operators-8jb27" Feb 27 01:24:19 crc kubenswrapper[4781]: I0227 01:24:19.384053 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ead436-ddbf-4703-971c-12f3b1a5673e-utilities\") pod \"community-operators-8jb27\" (UID: \"c4ead436-ddbf-4703-971c-12f3b1a5673e\") " pod="openshift-marketplace/community-operators-8jb27" Feb 27 01:24:19 crc kubenswrapper[4781]: I0227 01:24:19.485777 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmp5c\" (UniqueName: \"kubernetes.io/projected/c4ead436-ddbf-4703-971c-12f3b1a5673e-kube-api-access-zmp5c\") pod \"community-operators-8jb27\" (UID: \"c4ead436-ddbf-4703-971c-12f3b1a5673e\") " pod="openshift-marketplace/community-operators-8jb27" Feb 27 01:24:19 crc kubenswrapper[4781]: I0227 01:24:19.485836 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ead436-ddbf-4703-971c-12f3b1a5673e-utilities\") pod \"community-operators-8jb27\" (UID: \"c4ead436-ddbf-4703-971c-12f3b1a5673e\") " pod="openshift-marketplace/community-operators-8jb27" Feb 27 01:24:19 crc kubenswrapper[4781]: I0227 01:24:19.485966 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ead436-ddbf-4703-971c-12f3b1a5673e-catalog-content\") pod \"community-operators-8jb27\" (UID: \"c4ead436-ddbf-4703-971c-12f3b1a5673e\") " pod="openshift-marketplace/community-operators-8jb27" Feb 27 01:24:19 crc kubenswrapper[4781]: I0227 01:24:19.486368 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ead436-ddbf-4703-971c-12f3b1a5673e-utilities\") pod \"community-operators-8jb27\" (UID: \"c4ead436-ddbf-4703-971c-12f3b1a5673e\") " pod="openshift-marketplace/community-operators-8jb27" Feb 27 01:24:19 crc kubenswrapper[4781]: I0227 01:24:19.486443 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ead436-ddbf-4703-971c-12f3b1a5673e-catalog-content\") pod \"community-operators-8jb27\" (UID: \"c4ead436-ddbf-4703-971c-12f3b1a5673e\") " pod="openshift-marketplace/community-operators-8jb27" Feb 27 01:24:19 crc kubenswrapper[4781]: I0227 01:24:19.510402 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmp5c\" (UniqueName: \"kubernetes.io/projected/c4ead436-ddbf-4703-971c-12f3b1a5673e-kube-api-access-zmp5c\") pod \"community-operators-8jb27\" (UID: \"c4ead436-ddbf-4703-971c-12f3b1a5673e\") " pod="openshift-marketplace/community-operators-8jb27" Feb 27 01:24:19 crc kubenswrapper[4781]: I0227 01:24:19.618588 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8jb27" Feb 27 01:24:20 crc kubenswrapper[4781]: I0227 01:24:20.266059 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8jb27"] Feb 27 01:24:20 crc kubenswrapper[4781]: I0227 01:24:20.993990 4781 generic.go:334] "Generic (PLEG): container finished" podID="c4ead436-ddbf-4703-971c-12f3b1a5673e" containerID="e19a34d1785705dd57fd1093c2c39f6e187bac1f564a1cab54fd3c90f5370b20" exitCode=0 Feb 27 01:24:20 crc kubenswrapper[4781]: I0227 01:24:20.994088 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jb27" event={"ID":"c4ead436-ddbf-4703-971c-12f3b1a5673e","Type":"ContainerDied","Data":"e19a34d1785705dd57fd1093c2c39f6e187bac1f564a1cab54fd3c90f5370b20"} Feb 27 01:24:20 crc kubenswrapper[4781]: I0227 01:24:20.994127 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jb27" event={"ID":"c4ead436-ddbf-4703-971c-12f3b1a5673e","Type":"ContainerStarted","Data":"5c47e43f064528bc2831a2b9ccb5e3b1b6a8041ff6b8a9408e4fb27c0e6a7ceb"} Feb 27 01:24:23 crc kubenswrapper[4781]: I0227 01:24:23.013881 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jb27" event={"ID":"c4ead436-ddbf-4703-971c-12f3b1a5673e","Type":"ContainerStarted","Data":"128e971276569150918df5a591133a0d2e56dc20c3cb9da2466c97c5aa32ed0d"} Feb 27 01:24:25 crc kubenswrapper[4781]: I0227 01:24:25.056705 4781 generic.go:334] "Generic (PLEG): container finished" podID="c4ead436-ddbf-4703-971c-12f3b1a5673e" containerID="128e971276569150918df5a591133a0d2e56dc20c3cb9da2466c97c5aa32ed0d" exitCode=0 Feb 27 01:24:25 crc kubenswrapper[4781]: I0227 01:24:25.057310 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jb27" 
event={"ID":"c4ead436-ddbf-4703-971c-12f3b1a5673e","Type":"ContainerDied","Data":"128e971276569150918df5a591133a0d2e56dc20c3cb9da2466c97c5aa32ed0d"} Feb 27 01:24:26 crc kubenswrapper[4781]: I0227 01:24:26.069665 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jb27" event={"ID":"c4ead436-ddbf-4703-971c-12f3b1a5673e","Type":"ContainerStarted","Data":"15ccfc11009ddcff78073940c68304903ce3d786e9632dd5c7e1406271e395db"} Feb 27 01:24:26 crc kubenswrapper[4781]: I0227 01:24:26.090925 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8jb27" podStartSLOduration=2.613274467 podStartE2EDuration="7.090902046s" podCreationTimestamp="2026-02-27 01:24:19 +0000 UTC" firstStartedPulling="2026-02-27 01:24:20.995835605 +0000 UTC m=+4730.253375159" lastFinishedPulling="2026-02-27 01:24:25.473463184 +0000 UTC m=+4734.731002738" observedRunningTime="2026-02-27 01:24:26.084737783 +0000 UTC m=+4735.342277337" watchObservedRunningTime="2026-02-27 01:24:26.090902046 +0000 UTC m=+4735.348441620" Feb 27 01:24:29 crc kubenswrapper[4781]: I0227 01:24:29.309271 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:24:29 crc kubenswrapper[4781]: E0227 01:24:29.309874 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:24:29 crc kubenswrapper[4781]: I0227 01:24:29.619442 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8jb27" Feb 27 01:24:29 crc 
kubenswrapper[4781]: I0227 01:24:29.619512 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8jb27" Feb 27 01:24:29 crc kubenswrapper[4781]: I0227 01:24:29.666602 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8jb27" Feb 27 01:24:30 crc kubenswrapper[4781]: I0227 01:24:30.150838 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8jb27" Feb 27 01:24:30 crc kubenswrapper[4781]: I0227 01:24:30.206133 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8jb27"] Feb 27 01:24:32 crc kubenswrapper[4781]: I0227 01:24:32.130711 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8jb27" podUID="c4ead436-ddbf-4703-971c-12f3b1a5673e" containerName="registry-server" containerID="cri-o://15ccfc11009ddcff78073940c68304903ce3d786e9632dd5c7e1406271e395db" gracePeriod=2 Feb 27 01:24:32 crc kubenswrapper[4781]: I0227 01:24:32.804085 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8jb27" Feb 27 01:24:32 crc kubenswrapper[4781]: I0227 01:24:32.996511 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmp5c\" (UniqueName: \"kubernetes.io/projected/c4ead436-ddbf-4703-971c-12f3b1a5673e-kube-api-access-zmp5c\") pod \"c4ead436-ddbf-4703-971c-12f3b1a5673e\" (UID: \"c4ead436-ddbf-4703-971c-12f3b1a5673e\") " Feb 27 01:24:32 crc kubenswrapper[4781]: I0227 01:24:32.996621 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ead436-ddbf-4703-971c-12f3b1a5673e-catalog-content\") pod \"c4ead436-ddbf-4703-971c-12f3b1a5673e\" (UID: \"c4ead436-ddbf-4703-971c-12f3b1a5673e\") " Feb 27 01:24:32 crc kubenswrapper[4781]: I0227 01:24:32.996746 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ead436-ddbf-4703-971c-12f3b1a5673e-utilities\") pod \"c4ead436-ddbf-4703-971c-12f3b1a5673e\" (UID: \"c4ead436-ddbf-4703-971c-12f3b1a5673e\") " Feb 27 01:24:32 crc kubenswrapper[4781]: I0227 01:24:32.997663 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4ead436-ddbf-4703-971c-12f3b1a5673e-utilities" (OuterVolumeSpecName: "utilities") pod "c4ead436-ddbf-4703-971c-12f3b1a5673e" (UID: "c4ead436-ddbf-4703-971c-12f3b1a5673e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.002585 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4ead436-ddbf-4703-971c-12f3b1a5673e-kube-api-access-zmp5c" (OuterVolumeSpecName: "kube-api-access-zmp5c") pod "c4ead436-ddbf-4703-971c-12f3b1a5673e" (UID: "c4ead436-ddbf-4703-971c-12f3b1a5673e"). InnerVolumeSpecName "kube-api-access-zmp5c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.099060 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ead436-ddbf-4703-971c-12f3b1a5673e-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.099118 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmp5c\" (UniqueName: \"kubernetes.io/projected/c4ead436-ddbf-4703-971c-12f3b1a5673e-kube-api-access-zmp5c\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.144430 4781 generic.go:334] "Generic (PLEG): container finished" podID="c4ead436-ddbf-4703-971c-12f3b1a5673e" containerID="15ccfc11009ddcff78073940c68304903ce3d786e9632dd5c7e1406271e395db" exitCode=0 Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.144539 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jb27" event={"ID":"c4ead436-ddbf-4703-971c-12f3b1a5673e","Type":"ContainerDied","Data":"15ccfc11009ddcff78073940c68304903ce3d786e9632dd5c7e1406271e395db"} Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.145761 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jb27" event={"ID":"c4ead436-ddbf-4703-971c-12f3b1a5673e","Type":"ContainerDied","Data":"5c47e43f064528bc2831a2b9ccb5e3b1b6a8041ff6b8a9408e4fb27c0e6a7ceb"} Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.144556 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8jb27" Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.145832 4781 scope.go:117] "RemoveContainer" containerID="15ccfc11009ddcff78073940c68304903ce3d786e9632dd5c7e1406271e395db" Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.167419 4781 scope.go:117] "RemoveContainer" containerID="128e971276569150918df5a591133a0d2e56dc20c3cb9da2466c97c5aa32ed0d" Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.195525 4781 scope.go:117] "RemoveContainer" containerID="e19a34d1785705dd57fd1093c2c39f6e187bac1f564a1cab54fd3c90f5370b20" Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.246854 4781 scope.go:117] "RemoveContainer" containerID="15ccfc11009ddcff78073940c68304903ce3d786e9632dd5c7e1406271e395db" Feb 27 01:24:33 crc kubenswrapper[4781]: E0227 01:24:33.247328 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15ccfc11009ddcff78073940c68304903ce3d786e9632dd5c7e1406271e395db\": container with ID starting with 15ccfc11009ddcff78073940c68304903ce3d786e9632dd5c7e1406271e395db not found: ID does not exist" containerID="15ccfc11009ddcff78073940c68304903ce3d786e9632dd5c7e1406271e395db" Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.247362 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15ccfc11009ddcff78073940c68304903ce3d786e9632dd5c7e1406271e395db"} err="failed to get container status \"15ccfc11009ddcff78073940c68304903ce3d786e9632dd5c7e1406271e395db\": rpc error: code = NotFound desc = could not find container \"15ccfc11009ddcff78073940c68304903ce3d786e9632dd5c7e1406271e395db\": container with ID starting with 15ccfc11009ddcff78073940c68304903ce3d786e9632dd5c7e1406271e395db not found: ID does not exist" Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.247385 4781 scope.go:117] "RemoveContainer" 
containerID="128e971276569150918df5a591133a0d2e56dc20c3cb9da2466c97c5aa32ed0d" Feb 27 01:24:33 crc kubenswrapper[4781]: E0227 01:24:33.247696 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"128e971276569150918df5a591133a0d2e56dc20c3cb9da2466c97c5aa32ed0d\": container with ID starting with 128e971276569150918df5a591133a0d2e56dc20c3cb9da2466c97c5aa32ed0d not found: ID does not exist" containerID="128e971276569150918df5a591133a0d2e56dc20c3cb9da2466c97c5aa32ed0d" Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.247730 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"128e971276569150918df5a591133a0d2e56dc20c3cb9da2466c97c5aa32ed0d"} err="failed to get container status \"128e971276569150918df5a591133a0d2e56dc20c3cb9da2466c97c5aa32ed0d\": rpc error: code = NotFound desc = could not find container \"128e971276569150918df5a591133a0d2e56dc20c3cb9da2466c97c5aa32ed0d\": container with ID starting with 128e971276569150918df5a591133a0d2e56dc20c3cb9da2466c97c5aa32ed0d not found: ID does not exist" Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.247748 4781 scope.go:117] "RemoveContainer" containerID="e19a34d1785705dd57fd1093c2c39f6e187bac1f564a1cab54fd3c90f5370b20" Feb 27 01:24:33 crc kubenswrapper[4781]: E0227 01:24:33.247976 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e19a34d1785705dd57fd1093c2c39f6e187bac1f564a1cab54fd3c90f5370b20\": container with ID starting with e19a34d1785705dd57fd1093c2c39f6e187bac1f564a1cab54fd3c90f5370b20 not found: ID does not exist" containerID="e19a34d1785705dd57fd1093c2c39f6e187bac1f564a1cab54fd3c90f5370b20" Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.248006 4781 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e19a34d1785705dd57fd1093c2c39f6e187bac1f564a1cab54fd3c90f5370b20"} err="failed to get container status \"e19a34d1785705dd57fd1093c2c39f6e187bac1f564a1cab54fd3c90f5370b20\": rpc error: code = NotFound desc = could not find container \"e19a34d1785705dd57fd1093c2c39f6e187bac1f564a1cab54fd3c90f5370b20\": container with ID starting with e19a34d1785705dd57fd1093c2c39f6e187bac1f564a1cab54fd3c90f5370b20 not found: ID does not exist" Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.294450 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4ead436-ddbf-4703-971c-12f3b1a5673e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c4ead436-ddbf-4703-971c-12f3b1a5673e" (UID: "c4ead436-ddbf-4703-971c-12f3b1a5673e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.302581 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ead436-ddbf-4703-971c-12f3b1a5673e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.481364 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8jb27"] Feb 27 01:24:33 crc kubenswrapper[4781]: I0227 01:24:33.492240 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8jb27"] Feb 27 01:24:34 crc kubenswrapper[4781]: I0227 01:24:34.035998 4781 scope.go:117] "RemoveContainer" containerID="500185c8a41f1ea03fad4eed8ceeb62b2a655600fefd254d6835b485744f3e8b" Feb 27 01:24:35 crc kubenswrapper[4781]: I0227 01:24:35.319901 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4ead436-ddbf-4703-971c-12f3b1a5673e" path="/var/lib/kubelet/pods/c4ead436-ddbf-4703-971c-12f3b1a5673e/volumes" Feb 27 01:24:43 crc kubenswrapper[4781]: I0227 
01:24:43.310435 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:24:43 crc kubenswrapper[4781]: E0227 01:24:43.311303 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:24:58 crc kubenswrapper[4781]: I0227 01:24:58.309378 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:24:58 crc kubenswrapper[4781]: E0227 01:24:58.310279 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:25:12 crc kubenswrapper[4781]: I0227 01:25:12.309202 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:25:12 crc kubenswrapper[4781]: E0227 01:25:12.310074 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:25:27 crc 
kubenswrapper[4781]: I0227 01:25:27.310771 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:25:27 crc kubenswrapper[4781]: E0227 01:25:27.311988 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:25:38 crc kubenswrapper[4781]: I0227 01:25:38.309678 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:25:38 crc kubenswrapper[4781]: E0227 01:25:38.310495 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:25:52 crc kubenswrapper[4781]: I0227 01:25:52.309951 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:25:52 crc kubenswrapper[4781]: E0227 01:25:52.310796 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 
27 01:26:00 crc kubenswrapper[4781]: I0227 01:26:00.149300 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535926-dk95r"] Feb 27 01:26:00 crc kubenswrapper[4781]: E0227 01:26:00.150483 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ead436-ddbf-4703-971c-12f3b1a5673e" containerName="extract-content" Feb 27 01:26:00 crc kubenswrapper[4781]: I0227 01:26:00.150532 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ead436-ddbf-4703-971c-12f3b1a5673e" containerName="extract-content" Feb 27 01:26:00 crc kubenswrapper[4781]: E0227 01:26:00.150557 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ead436-ddbf-4703-971c-12f3b1a5673e" containerName="extract-utilities" Feb 27 01:26:00 crc kubenswrapper[4781]: I0227 01:26:00.150564 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ead436-ddbf-4703-971c-12f3b1a5673e" containerName="extract-utilities" Feb 27 01:26:00 crc kubenswrapper[4781]: E0227 01:26:00.150580 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ead436-ddbf-4703-971c-12f3b1a5673e" containerName="registry-server" Feb 27 01:26:00 crc kubenswrapper[4781]: I0227 01:26:00.150591 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ead436-ddbf-4703-971c-12f3b1a5673e" containerName="registry-server" Feb 27 01:26:00 crc kubenswrapper[4781]: I0227 01:26:00.150886 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4ead436-ddbf-4703-971c-12f3b1a5673e" containerName="registry-server" Feb 27 01:26:00 crc kubenswrapper[4781]: I0227 01:26:00.151809 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535926-dk95r" Feb 27 01:26:00 crc kubenswrapper[4781]: I0227 01:26:00.154132 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:26:00 crc kubenswrapper[4781]: I0227 01:26:00.154395 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:26:00 crc kubenswrapper[4781]: I0227 01:26:00.155306 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 01:26:00 crc kubenswrapper[4781]: I0227 01:26:00.159984 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535926-dk95r"] Feb 27 01:26:00 crc kubenswrapper[4781]: I0227 01:26:00.336929 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bv88\" (UniqueName: \"kubernetes.io/projected/4b2699a8-ed99-4729-8537-c56d4e3020a5-kube-api-access-8bv88\") pod \"auto-csr-approver-29535926-dk95r\" (UID: \"4b2699a8-ed99-4729-8537-c56d4e3020a5\") " pod="openshift-infra/auto-csr-approver-29535926-dk95r" Feb 27 01:26:00 crc kubenswrapper[4781]: I0227 01:26:00.438985 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bv88\" (UniqueName: \"kubernetes.io/projected/4b2699a8-ed99-4729-8537-c56d4e3020a5-kube-api-access-8bv88\") pod \"auto-csr-approver-29535926-dk95r\" (UID: \"4b2699a8-ed99-4729-8537-c56d4e3020a5\") " pod="openshift-infra/auto-csr-approver-29535926-dk95r" Feb 27 01:26:00 crc kubenswrapper[4781]: I0227 01:26:00.463430 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bv88\" (UniqueName: \"kubernetes.io/projected/4b2699a8-ed99-4729-8537-c56d4e3020a5-kube-api-access-8bv88\") pod \"auto-csr-approver-29535926-dk95r\" (UID: \"4b2699a8-ed99-4729-8537-c56d4e3020a5\") " 
pod="openshift-infra/auto-csr-approver-29535926-dk95r" Feb 27 01:26:00 crc kubenswrapper[4781]: I0227 01:26:00.473840 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535926-dk95r" Feb 27 01:26:00 crc kubenswrapper[4781]: I0227 01:26:00.960663 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535926-dk95r"] Feb 27 01:26:01 crc kubenswrapper[4781]: I0227 01:26:01.008910 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535926-dk95r" event={"ID":"4b2699a8-ed99-4729-8537-c56d4e3020a5","Type":"ContainerStarted","Data":"5b83dca4856c933ef6979aa2b51369b348695367cf72417555ffefd24a1d1e69"} Feb 27 01:26:03 crc kubenswrapper[4781]: I0227 01:26:03.030104 4781 generic.go:334] "Generic (PLEG): container finished" podID="4b2699a8-ed99-4729-8537-c56d4e3020a5" containerID="16a9f9d6ea8b6379b17f302340277b5de7121ea70fcd8873f81764601e863272" exitCode=0 Feb 27 01:26:03 crc kubenswrapper[4781]: I0227 01:26:03.030169 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535926-dk95r" event={"ID":"4b2699a8-ed99-4729-8537-c56d4e3020a5","Type":"ContainerDied","Data":"16a9f9d6ea8b6379b17f302340277b5de7121ea70fcd8873f81764601e863272"} Feb 27 01:26:04 crc kubenswrapper[4781]: I0227 01:26:04.581947 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535926-dk95r" Feb 27 01:26:04 crc kubenswrapper[4781]: I0227 01:26:04.733247 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bv88\" (UniqueName: \"kubernetes.io/projected/4b2699a8-ed99-4729-8537-c56d4e3020a5-kube-api-access-8bv88\") pod \"4b2699a8-ed99-4729-8537-c56d4e3020a5\" (UID: \"4b2699a8-ed99-4729-8537-c56d4e3020a5\") " Feb 27 01:26:04 crc kubenswrapper[4781]: I0227 01:26:04.750780 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b2699a8-ed99-4729-8537-c56d4e3020a5-kube-api-access-8bv88" (OuterVolumeSpecName: "kube-api-access-8bv88") pod "4b2699a8-ed99-4729-8537-c56d4e3020a5" (UID: "4b2699a8-ed99-4729-8537-c56d4e3020a5"). InnerVolumeSpecName "kube-api-access-8bv88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:26:04 crc kubenswrapper[4781]: I0227 01:26:04.835569 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bv88\" (UniqueName: \"kubernetes.io/projected/4b2699a8-ed99-4729-8537-c56d4e3020a5-kube-api-access-8bv88\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:05 crc kubenswrapper[4781]: I0227 01:26:05.050089 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535926-dk95r" event={"ID":"4b2699a8-ed99-4729-8537-c56d4e3020a5","Type":"ContainerDied","Data":"5b83dca4856c933ef6979aa2b51369b348695367cf72417555ffefd24a1d1e69"} Feb 27 01:26:05 crc kubenswrapper[4781]: I0227 01:26:05.050124 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535926-dk95r" Feb 27 01:26:05 crc kubenswrapper[4781]: I0227 01:26:05.050127 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b83dca4856c933ef6979aa2b51369b348695367cf72417555ffefd24a1d1e69" Feb 27 01:26:05 crc kubenswrapper[4781]: I0227 01:26:05.651235 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535920-hp92r"] Feb 27 01:26:05 crc kubenswrapper[4781]: I0227 01:26:05.661020 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535920-hp92r"] Feb 27 01:26:07 crc kubenswrapper[4781]: I0227 01:26:07.310025 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:26:07 crc kubenswrapper[4781]: E0227 01:26:07.310364 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:26:07 crc kubenswrapper[4781]: I0227 01:26:07.321513 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="960f5179-f532-4fbf-90fa-e19414cbe684" path="/var/lib/kubelet/pods/960f5179-f532-4fbf-90fa-e19414cbe684/volumes" Feb 27 01:26:18 crc kubenswrapper[4781]: I0227 01:26:18.310388 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:26:18 crc kubenswrapper[4781]: E0227 01:26:18.311348 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:26:31 crc kubenswrapper[4781]: I0227 01:26:31.318759 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:26:31 crc kubenswrapper[4781]: E0227 01:26:31.319896 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:26:34 crc kubenswrapper[4781]: I0227 01:26:34.377882 4781 scope.go:117] "RemoveContainer" containerID="7e0241ade9afef50720d7328e1e27817a80d46d9126df5c01d0b6695a2c96b4c" Feb 27 01:26:45 crc kubenswrapper[4781]: I0227 01:26:45.309574 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:26:45 crc kubenswrapper[4781]: E0227 01:26:45.310429 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:27:00 crc kubenswrapper[4781]: I0227 01:27:00.309062 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:27:00 crc kubenswrapper[4781]: 
E0227 01:27:00.309856 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:27:15 crc kubenswrapper[4781]: I0227 01:27:15.310281 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:27:15 crc kubenswrapper[4781]: E0227 01:27:15.311408 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:27:27 crc kubenswrapper[4781]: I0227 01:27:27.309393 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:27:27 crc kubenswrapper[4781]: E0227 01:27:27.310163 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:27:40 crc kubenswrapper[4781]: I0227 01:27:40.309371 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:27:40 crc 
kubenswrapper[4781]: E0227 01:27:40.310246 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v6fnj_openshift-machine-config-operator(32c19e2e-0830-47a5-9ea8-862e1c9d8571)\"" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" Feb 27 01:27:54 crc kubenswrapper[4781]: I0227 01:27:54.309364 4781 scope.go:117] "RemoveContainer" containerID="aa65480f2fbf04a45081ba34adf2d2c9cbf45bb721556fefc3482fe55dcfcd3f" Feb 27 01:27:55 crc kubenswrapper[4781]: I0227 01:27:55.114912 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" event={"ID":"32c19e2e-0830-47a5-9ea8-862e1c9d8571","Type":"ContainerStarted","Data":"3ca68bdb706f059286e9cab162cb1e9da3e558b32a0ae147b05ecbdd73deb984"} Feb 27 01:28:00 crc kubenswrapper[4781]: I0227 01:28:00.144471 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535928-h2h9s"] Feb 27 01:28:00 crc kubenswrapper[4781]: E0227 01:28:00.145442 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b2699a8-ed99-4729-8537-c56d4e3020a5" containerName="oc" Feb 27 01:28:00 crc kubenswrapper[4781]: I0227 01:28:00.145456 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b2699a8-ed99-4729-8537-c56d4e3020a5" containerName="oc" Feb 27 01:28:00 crc kubenswrapper[4781]: I0227 01:28:00.145719 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b2699a8-ed99-4729-8537-c56d4e3020a5" containerName="oc" Feb 27 01:28:00 crc kubenswrapper[4781]: I0227 01:28:00.146533 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535928-h2h9s" Feb 27 01:28:00 crc kubenswrapper[4781]: I0227 01:28:00.148714 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:28:00 crc kubenswrapper[4781]: I0227 01:28:00.149118 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:28:00 crc kubenswrapper[4781]: I0227 01:28:00.149029 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 01:28:00 crc kubenswrapper[4781]: I0227 01:28:00.156018 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535928-h2h9s"] Feb 27 01:28:00 crc kubenswrapper[4781]: I0227 01:28:00.222544 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdbn4\" (UniqueName: \"kubernetes.io/projected/7afe05a2-70dc-4ef0-93a1-af0fee2fd536-kube-api-access-pdbn4\") pod \"auto-csr-approver-29535928-h2h9s\" (UID: \"7afe05a2-70dc-4ef0-93a1-af0fee2fd536\") " pod="openshift-infra/auto-csr-approver-29535928-h2h9s" Feb 27 01:28:00 crc kubenswrapper[4781]: I0227 01:28:00.323741 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdbn4\" (UniqueName: \"kubernetes.io/projected/7afe05a2-70dc-4ef0-93a1-af0fee2fd536-kube-api-access-pdbn4\") pod \"auto-csr-approver-29535928-h2h9s\" (UID: \"7afe05a2-70dc-4ef0-93a1-af0fee2fd536\") " pod="openshift-infra/auto-csr-approver-29535928-h2h9s" Feb 27 01:28:00 crc kubenswrapper[4781]: I0227 01:28:00.343095 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdbn4\" (UniqueName: \"kubernetes.io/projected/7afe05a2-70dc-4ef0-93a1-af0fee2fd536-kube-api-access-pdbn4\") pod \"auto-csr-approver-29535928-h2h9s\" (UID: \"7afe05a2-70dc-4ef0-93a1-af0fee2fd536\") " 
pod="openshift-infra/auto-csr-approver-29535928-h2h9s" Feb 27 01:28:00 crc kubenswrapper[4781]: I0227 01:28:00.471078 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535928-h2h9s" Feb 27 01:28:00 crc kubenswrapper[4781]: I0227 01:28:00.952764 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535928-h2h9s"] Feb 27 01:28:01 crc kubenswrapper[4781]: I0227 01:28:01.169009 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535928-h2h9s" event={"ID":"7afe05a2-70dc-4ef0-93a1-af0fee2fd536","Type":"ContainerStarted","Data":"e288234e3a2fb7071d5906564e8fd676f6e72bef4cba6fce7f10e2a518fa17f6"} Feb 27 01:28:03 crc kubenswrapper[4781]: I0227 01:28:03.198470 4781 generic.go:334] "Generic (PLEG): container finished" podID="7afe05a2-70dc-4ef0-93a1-af0fee2fd536" containerID="b8699f3e4280117c3b1dd81cc2aea325b5c92d940a277b35186feed5736aed65" exitCode=0 Feb 27 01:28:03 crc kubenswrapper[4781]: I0227 01:28:03.198940 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535928-h2h9s" event={"ID":"7afe05a2-70dc-4ef0-93a1-af0fee2fd536","Type":"ContainerDied","Data":"b8699f3e4280117c3b1dd81cc2aea325b5c92d940a277b35186feed5736aed65"} Feb 27 01:28:04 crc kubenswrapper[4781]: I0227 01:28:04.804923 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535928-h2h9s" Feb 27 01:28:04 crc kubenswrapper[4781]: I0227 01:28:04.826858 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdbn4\" (UniqueName: \"kubernetes.io/projected/7afe05a2-70dc-4ef0-93a1-af0fee2fd536-kube-api-access-pdbn4\") pod \"7afe05a2-70dc-4ef0-93a1-af0fee2fd536\" (UID: \"7afe05a2-70dc-4ef0-93a1-af0fee2fd536\") " Feb 27 01:28:04 crc kubenswrapper[4781]: I0227 01:28:04.833363 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7afe05a2-70dc-4ef0-93a1-af0fee2fd536-kube-api-access-pdbn4" (OuterVolumeSpecName: "kube-api-access-pdbn4") pod "7afe05a2-70dc-4ef0-93a1-af0fee2fd536" (UID: "7afe05a2-70dc-4ef0-93a1-af0fee2fd536"). InnerVolumeSpecName "kube-api-access-pdbn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:28:04 crc kubenswrapper[4781]: I0227 01:28:04.929499 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdbn4\" (UniqueName: \"kubernetes.io/projected/7afe05a2-70dc-4ef0-93a1-af0fee2fd536-kube-api-access-pdbn4\") on node \"crc\" DevicePath \"\"" Feb 27 01:28:05 crc kubenswrapper[4781]: I0227 01:28:05.224695 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535928-h2h9s" event={"ID":"7afe05a2-70dc-4ef0-93a1-af0fee2fd536","Type":"ContainerDied","Data":"e288234e3a2fb7071d5906564e8fd676f6e72bef4cba6fce7f10e2a518fa17f6"} Feb 27 01:28:05 crc kubenswrapper[4781]: I0227 01:28:05.225165 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e288234e3a2fb7071d5906564e8fd676f6e72bef4cba6fce7f10e2a518fa17f6" Feb 27 01:28:05 crc kubenswrapper[4781]: I0227 01:28:05.224750 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535928-h2h9s" Feb 27 01:28:05 crc kubenswrapper[4781]: I0227 01:28:05.871725 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535922-v2mrc"] Feb 27 01:28:05 crc kubenswrapper[4781]: I0227 01:28:05.882319 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535922-v2mrc"] Feb 27 01:28:07 crc kubenswrapper[4781]: I0227 01:28:07.329201 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4de45e6-34d0-42f2-a5ef-3db90864a559" path="/var/lib/kubelet/pods/a4de45e6-34d0-42f2-a5ef-3db90864a559/volumes" Feb 27 01:28:34 crc kubenswrapper[4781]: I0227 01:28:34.476799 4781 scope.go:117] "RemoveContainer" containerID="2f2dfc7d2070f93d2a77b14a366a4665e36d3d475590bb3af0ca699d7f8bebd2" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.167963 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535930-n6tx4"] Feb 27 01:30:00 crc kubenswrapper[4781]: E0227 01:30:00.169269 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7afe05a2-70dc-4ef0-93a1-af0fee2fd536" containerName="oc" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.169290 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7afe05a2-70dc-4ef0-93a1-af0fee2fd536" containerName="oc" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.169569 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="7afe05a2-70dc-4ef0-93a1-af0fee2fd536" containerName="oc" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.170687 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535930-n6tx4" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.175391 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.175706 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rcqpr" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.175724 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.187110 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7"] Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.188820 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.191807 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.192206 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.202120 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535930-n6tx4"] Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.216242 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7"] Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.254720 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxspz\" (UniqueName: 
\"kubernetes.io/projected/e06cf708-30fa-49dc-adab-a1f2c7990710-kube-api-access-hxspz\") pod \"auto-csr-approver-29535930-n6tx4\" (UID: \"e06cf708-30fa-49dc-adab-a1f2c7990710\") " pod="openshift-infra/auto-csr-approver-29535930-n6tx4" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.254989 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/86366445-86a7-4d4e-9227-3f30e17f16a8-secret-volume\") pod \"collect-profiles-29535930-9q6h7\" (UID: \"86366445-86a7-4d4e-9227-3f30e17f16a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.255054 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hlsl\" (UniqueName: \"kubernetes.io/projected/86366445-86a7-4d4e-9227-3f30e17f16a8-kube-api-access-7hlsl\") pod \"collect-profiles-29535930-9q6h7\" (UID: \"86366445-86a7-4d4e-9227-3f30e17f16a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.255345 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86366445-86a7-4d4e-9227-3f30e17f16a8-config-volume\") pod \"collect-profiles-29535930-9q6h7\" (UID: \"86366445-86a7-4d4e-9227-3f30e17f16a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.357467 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/86366445-86a7-4d4e-9227-3f30e17f16a8-secret-volume\") pod \"collect-profiles-29535930-9q6h7\" (UID: \"86366445-86a7-4d4e-9227-3f30e17f16a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7" Feb 27 01:30:00 
crc kubenswrapper[4781]: I0227 01:30:00.357518 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hlsl\" (UniqueName: \"kubernetes.io/projected/86366445-86a7-4d4e-9227-3f30e17f16a8-kube-api-access-7hlsl\") pod \"collect-profiles-29535930-9q6h7\" (UID: \"86366445-86a7-4d4e-9227-3f30e17f16a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.357568 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86366445-86a7-4d4e-9227-3f30e17f16a8-config-volume\") pod \"collect-profiles-29535930-9q6h7\" (UID: \"86366445-86a7-4d4e-9227-3f30e17f16a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.357921 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxspz\" (UniqueName: \"kubernetes.io/projected/e06cf708-30fa-49dc-adab-a1f2c7990710-kube-api-access-hxspz\") pod \"auto-csr-approver-29535930-n6tx4\" (UID: \"e06cf708-30fa-49dc-adab-a1f2c7990710\") " pod="openshift-infra/auto-csr-approver-29535930-n6tx4" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.361536 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86366445-86a7-4d4e-9227-3f30e17f16a8-config-volume\") pod \"collect-profiles-29535930-9q6h7\" (UID: \"86366445-86a7-4d4e-9227-3f30e17f16a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.366603 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/86366445-86a7-4d4e-9227-3f30e17f16a8-secret-volume\") pod \"collect-profiles-29535930-9q6h7\" (UID: 
\"86366445-86a7-4d4e-9227-3f30e17f16a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.381468 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxspz\" (UniqueName: \"kubernetes.io/projected/e06cf708-30fa-49dc-adab-a1f2c7990710-kube-api-access-hxspz\") pod \"auto-csr-approver-29535930-n6tx4\" (UID: \"e06cf708-30fa-49dc-adab-a1f2c7990710\") " pod="openshift-infra/auto-csr-approver-29535930-n6tx4" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.382337 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hlsl\" (UniqueName: \"kubernetes.io/projected/86366445-86a7-4d4e-9227-3f30e17f16a8-kube-api-access-7hlsl\") pod \"collect-profiles-29535930-9q6h7\" (UID: \"86366445-86a7-4d4e-9227-3f30e17f16a8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.498485 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535930-n6tx4" Feb 27 01:30:00 crc kubenswrapper[4781]: I0227 01:30:00.518182 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7" Feb 27 01:30:01 crc kubenswrapper[4781]: I0227 01:30:01.157044 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7"] Feb 27 01:30:01 crc kubenswrapper[4781]: I0227 01:30:01.530514 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7" event={"ID":"86366445-86a7-4d4e-9227-3f30e17f16a8","Type":"ContainerStarted","Data":"4852cc38ee40f54d89db92a7646463e89e6392f011efd9bad64c5c8b6ff1b3eb"} Feb 27 01:30:01 crc kubenswrapper[4781]: I0227 01:30:01.675519 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535930-n6tx4"] Feb 27 01:30:01 crc kubenswrapper[4781]: W0227 01:30:01.682803 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode06cf708_30fa_49dc_adab_a1f2c7990710.slice/crio-15fe8717e4c838968b868e49e039958b88e62afd1f8d3bcd1ab02b91aa6a520e WatchSource:0}: Error finding container 15fe8717e4c838968b868e49e039958b88e62afd1f8d3bcd1ab02b91aa6a520e: Status 404 returned error can't find the container with id 15fe8717e4c838968b868e49e039958b88e62afd1f8d3bcd1ab02b91aa6a520e Feb 27 01:30:01 crc kubenswrapper[4781]: I0227 01:30:01.688959 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 01:30:02 crc kubenswrapper[4781]: I0227 01:30:02.548952 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535930-n6tx4" event={"ID":"e06cf708-30fa-49dc-adab-a1f2c7990710","Type":"ContainerStarted","Data":"15fe8717e4c838968b868e49e039958b88e62afd1f8d3bcd1ab02b91aa6a520e"} Feb 27 01:30:02 crc kubenswrapper[4781]: I0227 01:30:02.555889 4781 generic.go:334] "Generic (PLEG): container finished" 
podID="86366445-86a7-4d4e-9227-3f30e17f16a8" containerID="4911eafe22c06c1a3e360ba662d163448b23902834655fe9066930069a16df01" exitCode=0
Feb 27 01:30:02 crc kubenswrapper[4781]: I0227 01:30:02.555956 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7" event={"ID":"86366445-86a7-4d4e-9227-3f30e17f16a8","Type":"ContainerDied","Data":"4911eafe22c06c1a3e360ba662d163448b23902834655fe9066930069a16df01"}
Feb 27 01:30:04 crc kubenswrapper[4781]: I0227 01:30:04.204212 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7"
Feb 27 01:30:04 crc kubenswrapper[4781]: I0227 01:30:04.271970 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/86366445-86a7-4d4e-9227-3f30e17f16a8-secret-volume\") pod \"86366445-86a7-4d4e-9227-3f30e17f16a8\" (UID: \"86366445-86a7-4d4e-9227-3f30e17f16a8\") "
Feb 27 01:30:04 crc kubenswrapper[4781]: I0227 01:30:04.272129 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hlsl\" (UniqueName: \"kubernetes.io/projected/86366445-86a7-4d4e-9227-3f30e17f16a8-kube-api-access-7hlsl\") pod \"86366445-86a7-4d4e-9227-3f30e17f16a8\" (UID: \"86366445-86a7-4d4e-9227-3f30e17f16a8\") "
Feb 27 01:30:04 crc kubenswrapper[4781]: I0227 01:30:04.272156 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86366445-86a7-4d4e-9227-3f30e17f16a8-config-volume\") pod \"86366445-86a7-4d4e-9227-3f30e17f16a8\" (UID: \"86366445-86a7-4d4e-9227-3f30e17f16a8\") "
Feb 27 01:30:04 crc kubenswrapper[4781]: I0227 01:30:04.273701 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86366445-86a7-4d4e-9227-3f30e17f16a8-config-volume" (OuterVolumeSpecName: "config-volume") pod "86366445-86a7-4d4e-9227-3f30e17f16a8" (UID: "86366445-86a7-4d4e-9227-3f30e17f16a8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 01:30:04 crc kubenswrapper[4781]: I0227 01:30:04.280556 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86366445-86a7-4d4e-9227-3f30e17f16a8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "86366445-86a7-4d4e-9227-3f30e17f16a8" (UID: "86366445-86a7-4d4e-9227-3f30e17f16a8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:30:04 crc kubenswrapper[4781]: I0227 01:30:04.281034 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86366445-86a7-4d4e-9227-3f30e17f16a8-kube-api-access-7hlsl" (OuterVolumeSpecName: "kube-api-access-7hlsl") pod "86366445-86a7-4d4e-9227-3f30e17f16a8" (UID: "86366445-86a7-4d4e-9227-3f30e17f16a8"). InnerVolumeSpecName "kube-api-access-7hlsl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:30:04 crc kubenswrapper[4781]: I0227 01:30:04.375009 4781 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/86366445-86a7-4d4e-9227-3f30e17f16a8-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 27 01:30:04 crc kubenswrapper[4781]: I0227 01:30:04.375051 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hlsl\" (UniqueName: \"kubernetes.io/projected/86366445-86a7-4d4e-9227-3f30e17f16a8-kube-api-access-7hlsl\") on node \"crc\" DevicePath \"\""
Feb 27 01:30:04 crc kubenswrapper[4781]: I0227 01:30:04.375063 4781 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86366445-86a7-4d4e-9227-3f30e17f16a8-config-volume\") on node \"crc\" DevicePath \"\""
Feb 27 01:30:04 crc kubenswrapper[4781]: I0227 01:30:04.583714 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7" event={"ID":"86366445-86a7-4d4e-9227-3f30e17f16a8","Type":"ContainerDied","Data":"4852cc38ee40f54d89db92a7646463e89e6392f011efd9bad64c5c8b6ff1b3eb"}
Feb 27 01:30:04 crc kubenswrapper[4781]: I0227 01:30:04.584074 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4852cc38ee40f54d89db92a7646463e89e6392f011efd9bad64c5c8b6ff1b3eb"
Feb 27 01:30:04 crc kubenswrapper[4781]: I0227 01:30:04.584140 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-9q6h7"
Feb 27 01:30:05 crc kubenswrapper[4781]: I0227 01:30:05.299152 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2"]
Feb 27 01:30:05 crc kubenswrapper[4781]: I0227 01:30:05.324321 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535885-nfxm2"]
Feb 27 01:30:05 crc kubenswrapper[4781]: I0227 01:30:05.600127 4781 generic.go:334] "Generic (PLEG): container finished" podID="e06cf708-30fa-49dc-adab-a1f2c7990710" containerID="e1d561aa4537e2a7029bf8a15452492b8da6a5c737b211fb44b195f7580f906b" exitCode=0
Feb 27 01:30:05 crc kubenswrapper[4781]: I0227 01:30:05.600195 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535930-n6tx4" event={"ID":"e06cf708-30fa-49dc-adab-a1f2c7990710","Type":"ContainerDied","Data":"e1d561aa4537e2a7029bf8a15452492b8da6a5c737b211fb44b195f7580f906b"}
Feb 27 01:30:07 crc kubenswrapper[4781]: I0227 01:30:07.300644 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535930-n6tx4"
Feb 27 01:30:07 crc kubenswrapper[4781]: I0227 01:30:07.321668 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db" path="/var/lib/kubelet/pods/a6a4cfa6-d06d-4cc8-b77f-83b25ca8c6db/volumes"
Feb 27 01:30:07 crc kubenswrapper[4781]: I0227 01:30:07.349176 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxspz\" (UniqueName: \"kubernetes.io/projected/e06cf708-30fa-49dc-adab-a1f2c7990710-kube-api-access-hxspz\") pod \"e06cf708-30fa-49dc-adab-a1f2c7990710\" (UID: \"e06cf708-30fa-49dc-adab-a1f2c7990710\") "
Feb 27 01:30:07 crc kubenswrapper[4781]: I0227 01:30:07.359240 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e06cf708-30fa-49dc-adab-a1f2c7990710-kube-api-access-hxspz" (OuterVolumeSpecName: "kube-api-access-hxspz") pod "e06cf708-30fa-49dc-adab-a1f2c7990710" (UID: "e06cf708-30fa-49dc-adab-a1f2c7990710"). InnerVolumeSpecName "kube-api-access-hxspz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:30:07 crc kubenswrapper[4781]: I0227 01:30:07.452569 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxspz\" (UniqueName: \"kubernetes.io/projected/e06cf708-30fa-49dc-adab-a1f2c7990710-kube-api-access-hxspz\") on node \"crc\" DevicePath \"\""
Feb 27 01:30:07 crc kubenswrapper[4781]: I0227 01:30:07.625434 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535930-n6tx4" event={"ID":"e06cf708-30fa-49dc-adab-a1f2c7990710","Type":"ContainerDied","Data":"15fe8717e4c838968b868e49e039958b88e62afd1f8d3bcd1ab02b91aa6a520e"}
Feb 27 01:30:07 crc kubenswrapper[4781]: I0227 01:30:07.625539 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15fe8717e4c838968b868e49e039958b88e62afd1f8d3bcd1ab02b91aa6a520e"
Feb 27 01:30:07 crc kubenswrapper[4781]: I0227 01:30:07.625544 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535930-n6tx4"
Feb 27 01:30:08 crc kubenswrapper[4781]: I0227 01:30:08.369941 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535924-jqt2s"]
Feb 27 01:30:08 crc kubenswrapper[4781]: I0227 01:30:08.381750 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535924-jqt2s"]
Feb 27 01:30:09 crc kubenswrapper[4781]: I0227 01:30:09.323253 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d4094f7-d1b4-4771-8d18-ea76f4f2afc6" path="/var/lib/kubelet/pods/5d4094f7-d1b4-4771-8d18-ea76f4f2afc6/volumes"
Feb 27 01:30:12 crc kubenswrapper[4781]: I0227 01:30:12.895302 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 01:30:12 crc kubenswrapper[4781]: I0227 01:30:12.895758 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 01:30:34 crc kubenswrapper[4781]: I0227 01:30:34.584552 4781 scope.go:117] "RemoveContainer" containerID="ddf9edbab834e6c1a55e9fec04cd3a8734214a1dea92a8471be43663e726da2d"
Feb 27 01:30:34 crc kubenswrapper[4781]: I0227 01:30:34.643965 4781 scope.go:117] "RemoveContainer" containerID="13bcf8d94b2a16937b07dfe8f4ce503b88a240b7d9c23876edfc03e06b4dceeb"
Feb 27 01:30:42 crc kubenswrapper[4781]: I0227 01:30:42.895437 4781 patch_prober.go:28] interesting pod/machine-config-daemon-v6fnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 01:30:42 crc kubenswrapper[4781]: I0227 01:30:42.895989 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v6fnj" podUID="32c19e2e-0830-47a5-9ea8-862e1c9d8571" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"